
Approximate Bayes Seminar

@approxbayesseminar

Posting about the One World Approximate Bayesian Inference (ABI) Seminar, details at https://warwick.ac.uk/fac/sci/statistics/news/upcoming-seminars/abcworldseminar/

407 Followers · 21 Following · 57 Posts · Joined 23.10.2024

Latest posts by Approximate Bayes Seminar @approxbayesseminar


Finally, we propose two ideas that are useful in the SBI framework beyond robust inference: an SBI based method to obtain closed form approximations of intractable models and an active learning approach to more efficiently sample the parameter space.

09.03.2026 08:59 👍 0 🔁 0 💬 0 📌 0

We also develop an SBI based goodness-of-fit test to detect model misspecification.

09.03.2026 08:59 👍 0 🔁 0 💬 1 📌 0

The method guarantees valid inference, even when the model is incorrectly specified and even if the standard regularity conditions fail. Alternatively, we introduce model expansion through exponential tilting as another way to account for model misspecification.

09.03.2026 08:59 👍 1 🔁 0 💬 1 📌 0

This paper introduces robust methods that allow for valid frequentist inference in the presence of model misspecification. We propose a framework where the target of inference is a projection parameter that minimizes a discrepancy between the true distribution and the assumed model.

09.03.2026 08:59 👍 0 🔁 0 💬 1 📌 0
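In the usual misspecification notation (a sketch of the standard construction, not necessarily the paper's exact definitions), the projection parameter described above is

```latex
\theta^{\dagger} \;=\; \operatorname*{arg\,min}_{\theta \in \Theta} \; D\!\left(P^{*},\, P_{\theta}\right),
```

where \(P^{*}\) is the true data-generating distribution, \(\{P_\theta\}\) the assumed model, and \(D\) a discrepancy such as the Kullback–Leibler divergence or an integral probability metric. When the model is well specified, \(\theta^{\dagger}\) recovers the true parameter.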

SBI is often used when the likelihood is intractable and to construct confidence sets that do not rely on asymptotic methods or regularity conditions. Traditional SBI methods assume that the model is correct, but, as always, this can lead to invalid inference when the model is misspecified.

09.03.2026 08:59 👍 0 🔁 0 💬 1 📌 0

Abstract: Simulation-Based Inference (SBI) is an approach to statistical inference where simulations from an assumed model are used to construct estimators and confidence sets.

09.03.2026 08:59 👍 1 🔁 0 💬 1 📌 0

OWABI, 25 March 2026: Robust Simulation-Based Inference (11am UK time)
Speaker: Larry Wasserman (Carnegie Mellon University)
Title: Robust Simulation-Based Inference
warwick.ac.uk/fac/sc...

09.03.2026 08:59 👍 3 🔁 1 💬 1 📌 1

We validate our method on both a synthetic multi-dimensional time series and a real-world meteorological dataset, highlighting its practical utility for data assimilation in complex dynamical systems.

23.02.2026 09:16 👍 0 🔁 0 💬 0 📌 0

performance. For scalable inference, we employ easily parallelizable waste-free sequential Monte Carlo (SMC) samplers with preconditioned gradient-based kernels, enabling efficient exploration of high-dimensional parameter spaces such as those in DGFMs.

23.02.2026 09:16 👍 0 🔁 0 💬 1 📌 0

Since the true data-generating process often lies outside the assumed model class, we adopt an alternative notion of consistency and prove that, under mild conditions, both the prequential loss minimizer and the prequential posterior concentrate around parameters with optimal predictive

23.02.2026 09:16 👍 0 🔁 0 💬 1 📌 0

To overcome this, we introduce prequential posteriors, based upon a predictive-sequential (prequential) loss function, an approach naturally suited to temporally dependent data, which is the focus of forecasting tasks.

23.02.2026 09:16 👍 0 🔁 0 💬 1 📌 0
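A minimal sketch of the construction, assuming the usual generalized-Bayes form (our notation, not necessarily the paper's exact definitions): with one-step-ahead predictive densities \(p_\theta(\cdot \mid y_{1:t-1})\) and a scoring rule \(\ell\), the prequential loss and posterior are

```latex
L_{n}(\theta) \;=\; \sum_{t=1}^{n} \ell\big(y_{t},\; p_{\theta}(\cdot \mid y_{1:t-1})\big),
\qquad
\pi_{n}(\theta) \;\propto\; \pi(\theta)\, \exp\{-L_{n}(\theta)\}.
```

Each term scores the model on the next observation before it is revealed, which is why the construction respects temporal dependence.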

Deep generative forecasting models (DGFMs) have shown excellent performance in these areas, but assimilating data into such models is challenging due to their intractable likelihood functions. This limitation restricts the use of standard Bayesian data assimilation methodologies for DGFMs.

23.02.2026 09:16 👍 0 🔁 0 💬 1 📌 0

Abstract: Data assimilation is a fundamental task in updating forecasting models upon observing new data, with applications ranging from weather prediction to online reinforcement learning.

23.02.2026 09:15 👍 0 🔁 0 💬 1 📌 0

The next OWABI seminar will be given by Shreya Sinha Roy (Warwick), who will talk about "Prequential posteriors" on Thursday the 26th February at 11am UK time.
arxiv.org/abs/2511.1...
www.warwick.ac.uk/owabi
MS Teams link: teams.microsoft.com/...

23.02.2026 09:15 👍 7 🔁 4 💬 1 📌 0

or superior performance to existing state-of-the-art methods, such as Sequential Neural Posterior Estimation (SNPE).
Ref: Sharrock, Simons, Liu, Beaumont, Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score-Based Diffusion Models. PMLR. openreview.net/pdf?i...

28.01.2026 19:15 👍 0 🔁 0 💬 0 📌 0

the observation of interest, thereby reducing the simulation cost. We also introduce several alternative sequential approaches and discuss their relative merits. We then validate our method, as well as its amortised, non-sequential variant, on several numerical examples, demonstrating comparable

28.01.2026 19:15 👍 0 🔁 0 💬 1 📌 0

generate samples from the posterior distribution of interest. The model is trained using an objective function which directly estimates the score of the posterior. We embed the model into a sequential training procedure, which guides simulations using the current approximation of the posterior at

28.01.2026 19:15 👍 0 🔁 0 💬 1 📌 0
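For context, the standard conditional denoising score-matching objective behind such models (generic notation; the paper's exact formulation may differ) is

```latex
\mathcal{L}(\phi) \;=\; \mathbb{E}_{t,\,(\theta_{0}, x),\, \theta_{t}}
\Big[ \lambda(t)\, \big\| s_{\phi}(\theta_{t}, x, t) - \nabla_{\theta_{t}} \log q_{t}(\theta_{t} \mid \theta_{0}) \big\|^{2} \Big],
```

where \((\theta_{0}, x)\) are parameter–data pairs drawn by simulating the model, \(q_{t}\) is the forward noising kernel, and \(\lambda(t)\) a weighting function; the trained network \(s_{\phi}(\theta, x, t) \approx \nabla_{\theta} \log p_{t}(\theta \mid x)\) is then plugged into a reverse diffusion to sample from the posterior.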

Abstract: We introduce Sequential Neural Posterior Score Estimation (SNPSE), a score-based method for Bayesian inference in simulator-based models. Our method, inspired by the remarkable success of score-based methods in generative modelling, leverages conditional score-based diffusion models to

28.01.2026 19:15 👍 0 🔁 0 💬 1 📌 0

Tomorrow's OWABI seminar will be given by Louis Sharrock (UCL), who will talk about "Sequential Neural Score Estimation: Likelihood-free inference with conditional score-based diffusion models" (29th Jan, 11am UK time). Join: teams.microsoft.com/...
Meeting ID: 388 990 396 214 49
Passcode: f3Dh22XL

28.01.2026 19:15 👍 4 🔁 3 💬 1 📌 0

Through simulation studies and an application to toad movement models, this work explores whether full data approaches can overcome the limitations of summary statistic-based ABC for model choice.

Paper: arxiv.org/abs/2410.2...

Teams details:
Meeting ID: 365 183 408 357 76
Passcode: 6Fg9nW7T

17.11.2025 11:50 👍 0 🔁 0 💬 0 📌 0

Despite these developments, full data ABC approaches have not yet been widely applied to model selection problems. This paper seeks to address this gap by investigating the performance of ABC with statistical distances in model selection.

17.11.2025 11:50 👍 0 🔁 0 💬 1 📌 0

Recent advancements propose the use of full data approaches based on statistical distances, offering a promising alternative that bypasses the need for handcrafted summary statistics and can yield posterior approximations that more closely reflect the true posterior under suitable conditions.

17.11.2025 11:49 👍 0 🔁 0 💬 1 📌 0
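As a toy illustration of full-data ABC with a statistical distance (everything here — the Gaussian model, uniform prior, 1-D Wasserstein distance, and acceptance quantile — is an illustrative assumption, not a detail from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def wasserstein_1d(x, y):
    # 1-D Wasserstein-1 distance between equal-size samples:
    # mean absolute difference of the sorted values.
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def abc_rejection(observed, simulate, prior_sample, n_draws=5000, quantile=0.01):
    """Rejection ABC: keep the prior draws whose simulated datasets
    are closest to the observed data under a full-data distance."""
    thetas = np.array([prior_sample() for _ in range(n_draws)])
    dists = np.array([wasserstein_1d(simulate(t), observed) for t in thetas])
    eps = np.quantile(dists, quantile)  # data-driven tolerance
    return thetas[dists <= eps]

# Toy model: infer the mean of a Gaussian with known unit variance.
true_mu = 2.0
observed = rng.normal(true_mu, 1.0, size=200)
posterior = abc_rejection(
    observed,
    simulate=lambda mu: rng.normal(mu, 1.0, size=200),
    prior_sample=lambda: rng.uniform(-5.0, 5.0),
)
print(posterior.mean())  # typically close to the true mean 2.0
```

For model selection, the same accept/reject step can be run per candidate model, with the relative acceptance rates approximating posterior model probabilities.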

Approximate Bayesian computation (ABC) has emerged as a likelihood-free method. It is traditionally used with summary statistics to reduce data dimensionality; however, this often results in information loss that is difficult to quantify, particularly in model selection contexts.

17.11.2025 11:49 👍 0 🔁 0 💬 1 📌 0

This is typically achieved by calculating posterior probabilities, which quantify the support for each model given the observed data. However, in cases where likelihood functions are intractable, exact computation of these posterior probabilities becomes infeasible.

17.11.2025 11:49 👍 0 🔁 0 💬 1 📌 0
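The posterior probabilities in question are the standard Bayesian model-selection quantities:

```latex
p(M_{k} \mid y) \;=\; \frac{p(y \mid M_{k})\, p(M_{k})}{\sum_{j} p(y \mid M_{j})\, p(M_{j})},
\qquad
p(y \mid M_{k}) \;=\; \int p(y \mid \theta_{k}, M_{k})\, p(\theta_{k} \mid M_{k})\, d\theta_{k}.
```

The marginal likelihood integral is exactly what becomes infeasible when \(p(y \mid \theta_{k}, M_{k})\) is intractable.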

Bayesian statistics offers a flexible framework for model selection by updating prior beliefs as new data becomes available, allowing for ongoing refinement of candidate models.

17.11.2025 11:49 👍 0 🔁 0 💬 1 📌 0

Abstract: Model selection is a key task in statistics, playing a critical role across various scientific disciplines. While no model can fully capture the complexities of a real-world data-generating process, identifying the model that best approximates it can provide valuable insights.

17.11.2025 11:49 👍 0 🔁 0 💬 1 📌 0

The next OWABI seminar www.warwick.ac.uk/owabi is quickly approaching!

Our next speaker is Clara Grazian (Sydney) www.sydney.edu.au/sc..., who will talk about "Approximate Bayesian Computation with Statistical Distances for Model Selection" on 27th Nov, 11am UK time.

17.11.2025 11:49 👍 1 🔁 0 💬 1 📌 0

The recording of my talk on 'Multilevel neural simulation-based inference' at the 'One World Approximate Bayesian Inference' seminar series is now available on YouTube.

Link: www.youtube.com/watch?v=hBWd...

17.11.2025 09:40 👍 9 🔁 4 💬 0 📌 0

We demonstrate through both theoretical analysis and extensive experiments that our method can significantly enhance the accuracy of SBI methods given a fixed computational budget.

The talk will be streamed on MS Teams:
Meeting ID: 358 173 458 006 0
Passcode: Vp2975vC

27.10.2025 12:06 👍 0 🔁 0 💬 0 📌 0

In this paper, we propose a novel approach to neural SBI which leverages multilevel Monte Carlo techniques for settings where several simulators of varying cost and fidelity are available.

27.10.2025 12:06 👍 0 🔁 0 💬 1 📌 0
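The multilevel idea rests on the standard telescoping identity (generic MLMC notation; how the levels are coupled inside neural SBI is the paper's own contribution):

```latex
\mathbb{E}[f_{L}] \;=\; \mathbb{E}[f_{0}] \;+\; \sum_{l=1}^{L} \mathbb{E}\big[f_{l} - f_{l-1}\big],
```

where \(f_{l}\) denotes the quantity of interest computed with the level-\(l\) simulator: many cheap low-fidelity simulations estimate \(\mathbb{E}[f_{0}]\), while small numbers of coupled high-fidelity runs estimate the correction terms.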