Each entry lists the paper's title, date, and authors, followed by four attributes: Target, Neural Network, Samples, and Sequential.

Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

May 20, 2016 George Papamakarios, Iain Murray
Target: posterior Neural Network: mixture-density Samples: direct Sequential: yes

Flexible statistical inference for mechanistic models of neural dynamics

November 6, 2017 Jan-Matthis Lueckmann, Pedro J. Gonçalves, Giacomo Bassetto, Kaan Öcal, Marcel Nonnenmacher, Jakob H. Macke
Target: posterior Neural Network: mixture-density Samples: direct Sequential: yes

Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

May 18, 2018 George Papamakarios, David C. Sterratt, Iain Murray
Target: likelihood Neural Network: normalizing-flows Samples: MCMC Sequential: yes

Likelihood-free inference with emulator networks

May 23, 2018 Jan-Matthis Lueckmann, Giacomo Bassetto, Theofanis Karaletsos, Jakob H. Macke
Target: likelihood Neural Network: deep-ensembles Samples: MCMC Sequential: yes

Likelihood-free MCMC with Amortized Approximate Ratio Estimators

March 10, 2019 Joeri Hermans, Volodimir Begy, Gilles Louppe
Target: likelihood-ratio Neural Network: classifier-NN Samples: MCMC Sequential: no

Automatic posterior transformation for likelihood-free inference

May 17, 2019 David S. Greenberg, Marcel Nonnenmacher, Jakob H. Macke
Target: posterior Neural Network: normalizing-flows Samples: direct Sequential: yes

The frontier of simulation-based inference

November 4, 2019 Kyle Cranmer, Johann Brehmer, Gilles Louppe
Review paper
  • They discuss recent advances in LFI, including NN-based methods.

On Contrastive Learning for Likelihood-free Inference

February 10, 2020 Conor Durkan, Iain Murray, George Papamakarios
Target: likelihood-ratio, posterior Neural Network: classifier-NN, normalizing-flows Samples: MCMC, direct Sequential: yes
  • Code
  • They unify the ratio estimation approach by Hermans et al. (2019) with the sequential neural posterior formulation of Greenberg et al. (2019).
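A hedged sketch of the shared objective, in my own notation: with θ_1 the parameter that generated x and θ_2, …, θ_K contrastive atoms drawn independently, both approaches minimize a K-class cross-entropy

$$
\mathcal{L}(\phi) = -\,\mathbb{E}\left[ \log \frac{q_\phi(\theta_1 \mid x)\,/\,p(\theta_1)}{\sum_{k=1}^{K} q_\phi(\theta_k \mid x)\,/\,p(\theta_k)} \right],
$$

where q_φ is a normalizing flow in the posterior variant, and q_φ/p is replaced by a learned ratio r_φ(θ, x) in the ratio variant; K = 2 recovers a binary classifier as in Hermans et al. (2019).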

BayesFlow: Learning complex stochastic models with invertible neural networks

March 13, 2020 Stefan T. Radev, Ulf K. Mertens, Andreas Voss, Lynton Ardizzone, Ullrich Köthe
Target: posterior Neural Network: normalizing-flows Samples: direct Sequential: no

Distortion estimates for approximate Bayesian inference

June 19, 2020 Hanwen Xing, Geoff Nicholls, Jeong (Kate) Lee
Target: posterior Neural Network: normalizing-flows Samples: direct Sequential: no
  • They use normalizing flows to build distortion maps that act on the univariate marginals of the approximate posterior, moving them closer to the exact posterior for a specific observation, without ever evaluating the exact posterior.
  • Can be applied to any posterior approximation, provided simulating from the model is feasible.

sbi – A toolkit for simulation-based inference

July 17, 2020 Alvaro Tejero-Cantero, Jan Boelts, Michael Deistler, Jan-Matthis Lueckmann, Conor Durkan, Pedro J. Gonçalves, David S. Greenberg, Jakob H. Macke
Python package
  • Code
  • A Python toolbox for simulation-based inference methods based on NNs.
  • Documentation at this webpage
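A minimal usage sketch, assuming the package's documented interface (SNPE and BoxUniform are names from its API; the prior and simulator below are illustrative toys, and details may differ across versions):

```python
# Minimal sbi usage sketch; check the documentation for the current API.
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))

def simulator(theta):
    # Toy simulator: data = theta + Gaussian noise.
    return theta + 0.1 * torch.randn_like(theta)

theta = prior.sample((1000,))
x = simulator(theta)

inference = SNPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

x_o = torch.zeros(2)                       # the observation
samples = posterior.sample((500,), x=x_o)  # direct posterior samples
```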

Likelihood approximation networks (LANs) for fast inference of simulation models in cognitive neuroscience

December 2, 2020 Alexander Fengler, Lakshmi N Govindarajan, Tony Chen, Michael J Frank
Target: likelihood Neural Network: any-NN Samples: MCMC Sequential: no
  • Code
  • They train NNs to directly approximate pointwise likelihood values, obtained via KDE or binned empirical estimates from repeated model simulations.
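A hedged sketch of the idea (not the authors' code): build KDE log-likelihood targets from repeated simulations of a toy simulator, then regress them with a generic NN.

```python
# LAN-style sketch: emulate log p(x | theta) from simulation-based KDE targets.
import numpy as np
import torch
import torch.nn as nn
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def simulate(theta, n=500):
    # Toy simulator standing in for a cognitive-science model.
    return rng.normal(loc=theta, scale=1.0, size=n)

# Training targets: (theta, x, log-KDE-likelihood) triples.
rows = []
for th in rng.uniform(-3, 3, size=200):
    kde = gaussian_kde(simulate(th))
    xs = rng.uniform(-6, 6, size=50)
    rows += [(th, x, np.log(kde(x)[0] + 1e-12)) for x in xs]

data = torch.tensor(rows, dtype=torch.float32)
inputs, targets = data[:, :2], data[:, 2:]

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    ((net(inputs) - targets) ** 2).mean().backward()
    opt.step()
# net([theta, x]) now emulates log p(x | theta) and can be plugged into MCMC.
```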

Score Matched Neural Exponential Families for Likelihood-Free Inference

December 20, 2020 Lorenzo Pacchiardi, Ritabrata Dutta
Target: likelihood Neural Network: any-NN Samples: MCMC Sequential: no

Benchmarking Simulation-Based Inference

January 12, 2021 Jan-Matthis Lueckmann, Jan Boelts, David S. Greenberg, Pedro J. Gonçalves, Jakob H. Macke
Benchmark repository
  • Code
  • Benchmark repository for LFI methods.
  • They test the performance of LFI methods (including NN-based ones) on a variety of models and data sets.
  • Interactive results at this webpage
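A minimal sketch following the benchmark package's README (the sbibm package; API may change between versions):

```python
# Draw a benchmark task, its prior, simulator, and a reference observation.
import sbibm

task = sbibm.get_task("two_moons")  # one of the benchmark tasks
prior = task.get_prior()
simulator = task.get_simulator()
observation = task.get_observation(num_observation=1)

theta = prior(num_samples=1000)
x = simulator(theta)
# (theta, x) can now be fed to any of the benchmarked LFI methods.
```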

Sequential Neural Posterior and Likelihood Approximation

February 12, 2021 Samuel Wiqvist, Jes Frellsen, Umberto Picchini
Target: likelihood, posterior Neural Network: normalizing-flows Samples: direct Sequential: yes
  • Code
  • They use normalizing flows for both the posterior and the likelihood.
  • Optionally, an additional NN learns summary statistics jointly.

Robust and integrative Bayesian neural networks for likelihood-free parameter inference

February 12, 2021 Fredrik Wrede, Robin Eriksson, Richard Jiang, Linda Petzold, Stefan Engblom, Andreas Hellander, Prashant Singh
Target: posterior Neural Network: Bayesian-NN Samples: MCMC Sequential: yes
  • They bin the parameter space and train a classifier to predict, from data, which bin the generating parameter falls into.
  • The classifier is a BNN trained on a set of parameter-data pairs.
  • For a new observation, the posterior is estimated by running MCMC over the BNN's weight posterior and combining the resulting classifier predictions into a (discretized) posterior distribution.
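A hedged sketch of the bin-and-classify step, with a plain softmax classifier standing in for the paper's Bayesian NN (whose weight posterior would additionally be sampled by MCMC and the predictions averaged):

```python
# Discretize theta into bins, classify the bin from data, read off a
# discretized posterior from the softmax output.
import torch
import torch.nn as nn

n_bins = 20
edges = torch.linspace(-3, 3, n_bins + 1)

def simulate(theta):
    return theta + 0.5 * torch.randn_like(theta)  # toy simulator

theta = torch.empty(5000).uniform_(-3, 3)
x = simulate(theta)
labels = torch.bucketize(theta, edges[1:-1])  # bin index of each theta

clf = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, n_bins))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(1000):
    opt.zero_grad()
    loss_fn(clf(x.unsqueeze(1)), labels).backward()
    opt.step()

x_o = torch.tensor([[0.7]])
post = clf(x_o).softmax(dim=1)  # discretized posterior over the bins
```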

MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories

June 3, 2021 Giulio Isacchini, Natanael Spisak, Armita Nourmohammad, Thierry Mora, Aleksandra M. Walczak
Target: likelihood-ratio Neural Network: any-NN Samples: MCMC Sequential: no

GATSBI: Generative Adversarial Training for Simulation-Based Inference

September 29, 2021 Poornima Ramesh, Jan-Matthis Lueckmann, Jan Boelts, Álvaro Tejero-Cantero, David S. Greenberg, Pedro J. Gonçalves, Jakob H. Macke
Target: posterior Neural Network: generative-NN Samples: direct Sequential: yes
  • Uses Generative Adversarial Networks to represent the posterior.
  • Code

Variational methods for simulation-based inference

September 29, 2021 Manuel Glöckler, Michael Deistler, Jakob H. Macke
Target: likelihood, posterior Neural Network: normalizing-flows Samples: direct Sequential: yes
  • Code
  • They take the standard likelihood or likelihood-ratio approaches and replace MCMC with variational inference via normalizing flows (objective sketched below).
  • This retains the benefits of targeting the likelihood without the cost of MCMC.
  • Related to Wiqvist et al. (2021) above.
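A hedged sketch of the variational step, in my own notation: given a learned surrogate likelihood ℓ_φ(x_o | θ), the flow q_ψ is fitted by maximizing an ELBO-style objective

$$
\mathcal{L}(\psi) = \mathbb{E}_{\theta \sim q_\psi}\big[ \log \ell_\phi(x_o \mid \theta) + \log p(\theta) - \log q_\psi(\theta) \big],
$$

so posterior samples come directly from the flow; the paper also considers alternative variational objectives.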

Arbitrary Marginal Neural Ratio Estimation for Simulation-based Inference

October 1, 2021 François Rozet, Gilles Louppe
Target: likelihood-ratio Neural Network: classifier-NN Samples: MCMC Sequential: no
  • Extension of Hermans et al. 2019 (above) that makes it possible to obtain any marginal of the posterior explicitly (a mask-conditioning sketch follows this list).
  • Marginals can therefore be plotted without posterior sampling.
  • Code
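A hedged sketch of mask-conditioned ratio estimation in the spirit of this paper (not the authors' code): the classifier takes (θ · mask, mask, x), so one network amortizes the ratio for every subset of parameters.

```python
# One ratio network for all marginals, via a binary mask over parameters.
import torch
import torch.nn as nn

d = 3  # parameter dimension
net = nn.Sequential(nn.Linear(2 * d + 1, 64), nn.ReLU(), nn.Linear(64, 1))

def log_ratio(theta, mask, x):
    return net(torch.cat([theta * mask, mask, x], dim=1))

# One training batch: positives are joint draws (theta, x); negatives pair x
# with independent theta'; a random mask is sampled per example.
theta = torch.rand(256, d)
x = theta.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)
mask = (torch.rand(256, d) < 0.5).float()
theta_neg = torch.rand(256, d)

logits = torch.cat([log_ratio(theta, mask, x),
                    log_ratio(theta_neg, mask, x)])
labels = torch.cat([torch.ones(256, 1), torch.zeros(256, 1)])
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
# (Optimization loop over fresh batches omitted.) After training,
# log_ratio(theta, mask, x_o) targets the marginal posterior of the
# masked-in components, so any 1D/2D marginal can be evaluated directly.
```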

Variational Likelihood-Free Gradient Descent

November 22, 2021 Jack Simons, Song Liu, Mark Beaumont
Target: likelihood-ratio Neural Network: classifier-NN Samples: Stein Variational Gradient Descent Sequential: yes
  • Maintains a set of particles that serves as the posterior approximation.
  • The particles are fitted to the posterior by alternating between NN training and Stein Variational Gradient Descent (SVGD) steps (one step sketched below).
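A hedged sketch of one SVGD step with an RBF kernel (the NN-estimated log-ratio plus log-prior would stand in for log_prob; the toy below targets a standard normal):

```python
# One SVGD update: kernel-weighted gradient ascent plus a repulsion term.
import torch

def svgd_step(particles, log_prob, step=0.1, bandwidth=1.0):
    particles = particles.detach().requires_grad_(True)
    grad_logp = torch.autograd.grad(log_prob(particles).sum(), particles)[0]
    diff = particles.unsqueeze(1) - particles.unsqueeze(0)        # (n, n, d)
    k = torch.exp(-(diff ** 2).sum(-1) / (2 * bandwidth ** 2))    # (n, n)
    grad_k = -(k.unsqueeze(-1) * diff / bandwidth ** 2).sum(0)    # (n, d)
    phi = (k @ grad_logp + grad_k) / particles.size(0)
    return (particles + step * phi).detach()

particles = 3 * torch.randn(100, 2)
for _ in range(200):
    particles = svgd_step(particles, lambda t: -(t ** 2).sum(-1) / 2)
# particles now approximate the target "posterior" (here a standard normal).
```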

Group equivariant neural posterior estimation

November 25, 2021 Maximilian Dax, Stephen R. Green, Jonathan Gair, Michael Deistler, Bernhard Schölkopf, Jakob H. Macke
Target: posterior Neural Network: normalizing-flows Samples: MCMC Sequential: no
  • Extension of the Neural Posterior Estimation approach with normalizing flows to incorporate equivariances.
  • The method works without changing the NN architecture: the data is transformed before being fed into the NN, with the transformation itself parametrized by a NN.
  • It defines a joint posterior over parameters and transformation and obtains posterior samples via Gibbs sampling on the two conditionals (neither conditional step requires MCMC: the normalizing flow handles the conditional on parameters, and blurring via a kernel handles the conditional on the transformation).

BayesFlow Can Reliably Detect Model Misspecification and Posterior Errors in Amortized Bayesian Inference

December 16, 2021 Marvin Schmitt, Paul-Christian Bürkner, Ullrich Köthe, Stefan T. Radev
Target: posterior Neural Network: normalizing-flows Samples: direct Sequential: no
  • Code
  • They modify the original BayesFlow approach (Radev et al. 2020, see above) to detect and diagnose model misspecification, using maximum mean discrepancy (MMD); a sketch of the check follows.
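A hedged sketch of the MMD check (Gaussian-kernel MMD shown; the paper's exact estimator and summary network differ): compare summary-network outputs for observed data against those for data simulated from the model, and flag misspecification when the discrepancy is large.

```python
# Biased squared-MMD estimate between simulated and observed summaries.
import torch

def mmd2(a, b, bandwidth=1.0):
    def k(u, v):
        return torch.exp(-torch.cdist(u, v) ** 2 / (2 * bandwidth ** 2))
    return k(a, a).mean() + k(b, b).mean() - 2 * k(a, b).mean()

z_sim = torch.randn(500, 4)        # summaries of simulated data (toy)
z_obs = torch.randn(100, 4) + 0.5  # summaries of observed data (toy)
print(mmd2(z_sim, z_obs))          # large value suggests misspecification
```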

Conditional Simulation Using Diffusion Schrödinger Bridges

February 27, 2022 Yuyang Shi, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet
Target: posterior Neural Network: denoising-diffusion-model Samples: direct Sequential: no
  • They use denoising diffusion models on an extended state space, which enables likelihood-free Bayesian inference.