8/n I am at #NeurIPS2025 this week. Our poster session is Wed Dec 3, 11:00 - 14:00. Come say hi if you'd like to chat about amortized inference!
7/n Joint work with @wenxinyi.bsky.social @ayushbharti.bsky.social @samikaski.bsky.social @lacerbi.bsky.social and supported by ELLIS Institute Finland, FCAI, Aalto University, University of Helsinki. Huge thanks for the great environment and support.
6/n Experiments on active learning benchmarks, classical BED tasks (location-finding, CES), and a psychometric model show ALINE achieves fast, accurate inference together with efficient and flexible data acquisition.
5/n We train ALINE with reinforcement learning using a self-estimated information-gain reward, so the model learns to ask the questions that most improve its own beliefs.
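For intuition, here is a toy numpy sketch of an information-gain-style reward: the reduction in the entropy of the model's own (here Gaussian) belief after observing a query. The numbers and the Gaussian form are illustrative only, not the paper's actual reward computation.

```python
import numpy as np

def gaussian_entropy(std):
    """Differential entropy of a 1-D Gaussian with the given std."""
    return 0.5 * np.log(2 * np.pi * np.e * std**2)

# Illustrative belief widths before and after observing a chosen query
# point (made-up numbers, not from the paper):
std_before, std_after = 1.0, 0.4

# Self-estimated information gain: how much the query sharpened the belief
reward = gaussian_entropy(std_before) - gaussian_entropy(std_after)
# reward > 0 whenever the observation reduced the model's uncertainty
```

The entropy difference simplifies to log(std_before / std_after), so the reward is positive exactly when the belief narrows.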
4/n ALINE is built on a Transformer neural process with two specialized heads: an inference head that amortizes the posterior & predictive, and a policy head that proposes the next query point given the current dataset.
3/n In this paper, we propose ALINE, which can
โข instantly selects the next informative data point to query
โข performs amortized Bayesian inference over parameters & predictions
โข switches targets at runtime without retraining.
2/n Many tasks, from scientific discovery to medical diagnosis, require us to rapidly decide which data to acquire next to maximize our knowledge, and to rapidly learn about the unknown factors from the collected data. Existing methods typically amortize only one of these.
We jointly amortize Bayesian inference and active data acquisition within a single architecture.
Excited to share our #NeurIPS2025 ✨Spotlight✨ paper "ALINE: Joint Amortization for Bayesian Inference and Active Data Acquisition"!
www.huangdaolang.com/aline/
@lacerbi.bsky.social wrote a very nice summary post of our paper here if anyone missed it:
bsky.app/profile/lace...
I can give some more behind-the-scenes information. 🧵
1/ Introducing ACE (Amortized Conditioning Engine)! Our new AISTATS 2025 paper presents a transformer framework that unifies tasks from image completion to BayesOpt & simulator-based inference under *one* probabilistic conditioning approach. It's Bayes all the way down!
Interested in amortization + experimental design + decision making? #NeurIPS2024
Come by to our poster, starting soon (11-2, East Hall)!
NeurIPS link: neurips.cc/virtual/2024...
Paper: openreview.net/forum?id=zBG...
with @huangdaolang.bsky.social Yujia Guo @samikaski.bsky.social
I am at #NeurIPS2024 and looking for postdocs. Feel free to reach out if you want to discuss!
We also have other open positions: fcai.fi/winter-calls.... My slightly outdated home page is: kaski-lab.com
1/ Hi all, I am at #NeurIPS2024 and I will be hiring a postdoc in probabilistic machine learning starting asap.
Research interests: amortized, approximate & simulator-based inference, Bayesian optimization, and AI4science.
Get in touch for a chat or come to our posters today 11AM or Friday 11AM!
Great list! Can I join?
8/ Join us at our poster session at #NeurIPS2024. Unfortunately this year I can't attend in person, but @lacerbi.bsky.social will present our work. We are excited to discuss and explore future directions in Bayesian experimental design and amortization!
7/ Experiments show TNDP significantly outperforms traditional methods across various tasks, including targeted active learning, hyperparameter optimization, and retrosynthesis planning.
6/ Our Transformer Neural Decision Process (TNDP) unifies experimental design and decision-making in a single framework, allowing instant design proposals while maintaining high decision quality.
5/ We introduce Decision Utility Gain (DUG) to guide experimental design with a direct focus on optimizing decision-making tasks, moving beyond traditional information-theoretic objectives. DUG measures the improvement in the maximum expected utility from observing a new experimental design.
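In symbols (my paraphrase of the definition above; the notation is mine, not necessarily the paper's): with utility $u(\theta, a)$ for decision $a$ under parameters $\theta$, history $h_t$, and candidate design $\xi$ with outcome $y$,

```latex
\mathrm{DUG}(\xi) =
\mathbb{E}_{y \sim p(y \mid \xi, h_t)}\!\left[
  \max_a \, \mathbb{E}_{\theta}\!\left[ u(\theta, a) \mid h_t \cup \{(\xi, y)\} \right]
\right]
- \max_a \, \mathbb{E}_{\theta}\!\left[ u(\theta, a) \mid h_t \right]
```

i.e., the expected improvement in the maximum expected utility from running the new experiment, rather than the expected reduction in parameter uncertainty.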
4/ In our work, we present a new amortized BED framework that optimizes experiments directly for downstream decision-making.
3/ But what if our goal goes beyond parameter inference? In many real-world tasks like medical diagnosis, we care more about making the right decisions than learning model parameters.
2/ Bayesian Experimental Design (BED) is a powerful framework to optimize experiments aimed at reducing uncertainty about unknown system parameters. Recent amortized BED methods use pre-trained neural networks for instant design proposals.
Optimizing decision utility in Bayesian experimental design is key to improving downstream decision-making.
Excited to share our #NeurIPS2024 paper on Amortized Decision-Aware Bayesian Experimental Design: arxiv.org/abs/2411.02064
@lacerbi.bsky.social @samikaski.bsky.social
Details below.
1/ Excuse me, can I interest you in eliciting your beliefs as flexible probability distributions? No worries, we only need pairwise comparisons or rankings, no personal details.
Led by **Petrus Mikkola** and joint with **Arto Klami**, to be presented soon at @neuripsconf.bsky.social #NeurIPS2024