pith. machine review for the scientific record.

arxiv: 2512.23748 · v2 · submitted 2025-12-26 · 💻 cs.LG · math.PR · stat.ML

Recognition: no theorem link

A Review of Diffusion-based Simulation-Based Inference: Foundations and Applications in Non-Ideal Data Scenarios

Authors on Pith: no claims yet

Pith reviewed 2026-05-16 18:55 UTC · model grok-4.3

classification 💻 cs.LG · math.PR · stat.ML

keywords diffusion models · simulation-based inference · model misspecification · missing data · unstructured observations · posterior estimation · likelihood-free inference · geophysical uncertainty quantification

The pith

Diffusion models enable accurate posterior inference from simulators despite model misspecification, irregular observations, and missing data.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This review synthesizes the foundations of diffusion-based simulation-based inference and shows how these methods address limitations of earlier neural approaches like normalizing flows when dealing with intractable likelihoods. It surveys eight specific techniques that adapt diffusion processes to handle three common non-ideal scenarios in scientific computing: simulator-reality mismatch, unstructured or infinite-dimensional observations, and incomplete data. A reader would care because these challenges routinely block reliable parameter estimation in domains such as geophysical modeling, where classical likelihood methods fail outright. The paper maintains consistent notation while stressing the conditions under which the learned posteriors remain accurate.

Core claim

Diffusion models learn posteriors directly from simulator outputs by reversing a noising process. Variants such as conditional diffusion for irregular data, guided diffusion for prior adaptation, sequential and factorized methods for efficiency, and consistency models for fast sampling maintain accurate posteriors under model misspecification, unstructured observations, and missing data.
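As a toy illustration of the reverse-noising mechanism (not the paper's algorithm): for a standard-normal prior and a linear-Gaussian simulator the posterior is Gaussian, so the score of its noised marginals is available in closed form and the reverse-time VP-SDE can be integrated directly. In a real SBI pipeline a network s_phi(theta_t, x, t) trained on simulator pairs would replace the analytic `score` below; all names and constants are illustrative.

```python
import numpy as np

# Toy setup: prior theta ~ N(0, 1), simulator x = theta + eps with
# eps ~ N(0, sigma2). The exact posterior p(theta | x_obs) is N(m, s2).
rng = np.random.default_rng(0)
sigma2, x_obs, beta = 0.25, 1.2, 8.0
m = x_obs / (1.0 + sigma2)      # true posterior mean  (0.96)
s2 = sigma2 / (1.0 + sigma2)    # true posterior variance (0.2)

def score(theta, t):
    """Closed-form score of the VP-noised posterior marginal at time t."""
    alpha = np.exp(-0.5 * beta * t)
    var = alpha**2 * s2 + (1.0 - alpha**2)
    return -(theta - alpha * m) / var

# Reverse-time Euler-Maruyama: integrate the reverse VP-SDE from t=1 to t=0.
n_steps, n_particles = 500, 20000
dt = 1.0 / n_steps
theta = rng.standard_normal(n_particles)  # start near the stationary N(0, 1)
for i in range(n_steps, 0, -1):
    t = i * dt
    drift = 0.5 * beta * theta + beta * score(theta, t)
    theta = theta + drift * dt + np.sqrt(beta * dt) * rng.standard_normal(n_particles)

print(theta.mean(), theta.std())  # should approach the true (0.96, 0.447)
```

Replacing the analytic score with a learned conditional score network is exactly what turns this sketch into a diffusion-based SBI method.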

What carries the argument

Conditional and guided diffusion processes that generate posterior samples while incorporating irregular or incomplete observations and adapting to simulator-reality gaps.

If this is right

  • Posteriors remain reliable even when the simulator systematically differs from the real system.
  • Observations need not lie on a regular grid or have fixed dimension for inference to proceed.
  • Sequential and factorized diffusion variants reduce the number of simulator calls required.
  • Consistency models allow posterior sampling in far fewer steps than standard diffusion.
  • These techniques support uncertainty quantification in geophysical applications where all three non-ideal conditions appear together.
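To make the efficiency bullet concrete, here is a minimal sequential-refinement sketch: a rejection-ABC stand-in (not one of the reviewed diffusion methods) where round 2 proposes from a Gaussian fit to round-1 acceptances, so a larger fraction of simulator calls lands near the observation. All names and tolerances are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
x_obs, sigma, eps = 1.0, 0.2, 0.1

def simulate(theta):
    # Toy simulator: observation is the parameter plus Gaussian noise.
    return theta + sigma * rng.standard_normal(theta.shape)

def run_round(proposal, n):
    theta = proposal(n)
    accepted = theta[np.abs(simulate(theta) - x_obs) < eps]
    return accepted

# Round 1: propose from the broad prior N(0, 2^2).
acc1 = run_round(lambda n: 2.0 * rng.standard_normal(n), 20000)
rate1 = len(acc1) / 20000

# Round 2: propose from a Gaussian fit to the round-1 acceptances.
mu, sd = acc1.mean(), acc1.std()
acc2 = run_round(lambda n: mu + sd * rng.standard_normal(n), 20000)
rate2 = len(acc2) / 20000

print(rate1, rate2)  # round 2 accepts a much larger fraction per simulator call
```

Sequential and factorized diffusion variants exploit the same idea at the level of the learned score: later rounds spend simulator budget where the current posterior concentrates.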

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same diffusion machinery could be tested on high-dimensional climate or biological simulators that share similar data irregularities.
  • Combining guided diffusion with existing amortized inference pipelines might further cut computational cost without retraining from scratch.
  • Explicit checks of posterior calibration on synthetic misspecification benchmarks would strengthen the robustness claims beyond the geophysical examples.
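A calibration check of the kind suggested above can be sketched with simulation-based calibration (SBC) rank statistics: draw a parameter from the prior, simulate, draw posterior samples, and record the rank of the true parameter; calibrated posteriors yield uniform ranks. This hypothetical toy uses an exact Gaussian posterior so it is runnable; in practice the posterior draws would come from a diffusion sampler under a deliberately misspecified simulator.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, L = 0.25, 99  # L posterior draws per rank statistic
ranks = []
for _ in range(2000):
    theta0 = rng.standard_normal()                         # theta ~ prior N(0, 1)
    x = theta0 + np.sqrt(sigma2) * rng.standard_normal()   # simulate one dataset
    m = x / (1 + sigma2)
    s = np.sqrt(sigma2 / (1 + sigma2))
    post = m + s * rng.standard_normal(L)                  # exact posterior draws
    ranks.append(int((post < theta0).sum()))               # rank in {0, ..., L}
ranks = np.array(ranks)

# Calibrated posteriors give uniform ranks over {0, ..., L};
# a chi-square statistic over binned ranks flags deviation.
hist, _ = np.histogram(ranks, bins=10, range=(0, L + 1))
expected = len(ranks) / 10
chi2 = ((hist - expected) ** 2 / expected).sum()
print(chi2)  # 9 d.o.f.; values far above ~21.7 would indicate miscalibration
```

Running the same check with a misspecified simulator would make the "robustness" claims of the surveyed methods directly testable.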

Load-bearing premise

The eight surveyed diffusion methods preserve accurate posterior distributions when the simulator does not match reality or when observations are incomplete or irregularly structured.

What would settle it

A controlled experiment on a deliberately misspecified simulator with deliberately missing data points: if posterior samples from one of the reviewed methods deviate significantly from ground-truth posteriors obtained by exhaustive sampling, the robustness claim fails; if they match within statistical tolerance, it stands.

read the original abstract

For complex simulation problems, inferring parameters often precludes the use of classical likelihood-based techniques due to intractable likelihoods. Simulation-based inference (SBI) methods offer a likelihood-free approach to directly learn posterior distributions $p(\boldsymbol{\theta} \mid x_{\mathrm{obs}})$ from simulator outputs. Recently, diffusion models have emerged as promising tools for SBI, addressing limitations of earlier neural methods such as neural likelihood/posterior estimation and normalizing flows. This review examines diffusion-based SBI from first principles to applications, emphasizing robustness in three non-ideal data scenarios common to scientific computing: model misspecification (simulator-reality mismatch), unstructured or infinite-dimensional observations, and missing data. We synthesize mathematical foundations and survey eight methods addressing these challenges, such as conditional diffusion for irregular data, guided diffusion for prior adaptation, sequential and factorized approaches for efficiency, and consistency models for fast sampling. Throughout, we maintain consistent notation and emphasize conditions required for accurate posteriors. We conclude with open problems and applications to geophysical uncertainty quantification, where these challenges are acute.
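For readers new to the area, the training principle behind the abstract's "learn posterior distributions from simulator outputs" is conditional denoising score matching; in one standard (illustrative, not the paper's) notation:

```latex
\mathcal{L}(\phi)
= \mathbb{E}_{t,\;(\theta, x) \sim p(\theta)\,p(x \mid \theta),\;\epsilon \sim \mathcal{N}(0, I)}
\left[ \lambda(t) \left\| s_\phi(\theta_t, x, t) + \frac{\epsilon}{\sigma_t} \right\|^2 \right],
\qquad \theta_t = \alpha_t \theta + \sigma_t \epsilon .
```

Its minimizer satisfies $s_\phi(\theta_t, x, t) = \nabla_{\theta_t} \log p_t(\theta_t \mid x)$, so integrating the reverse-time SDE with the learned score yields samples from $p(\theta \mid x)$ without ever evaluating the likelihood.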

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 2 minor

Summary. The manuscript is a review paper that synthesizes diffusion-based simulation-based inference (SBI) methods from first principles, surveying eight existing approaches for robustness in three non-ideal scenarios: model misspecification, unstructured or infinite-dimensional observations, and missing data. It maintains consistent notation across foundations and applications, restates conditions for accurate posteriors from the literature, and concludes with open problems plus an application to geophysical uncertainty quantification.

Significance. If the synthesis accurately captures the surveyed methods, the review provides a useful organizational framework for an emerging area, highlighting how diffusion models address limitations of neural likelihood estimation and normalizing flows in SBI. The emphasis on conditions for posterior accuracy and consistent notation strengthens its utility as a reference for practitioners in scientific computing.

minor comments (2)
  1. [Abstract] The abstract states that eight methods are surveyed but does not name them; adding a short enumerated list would improve immediate accessibility for readers.
  2. [§3.2] In the section on sequential and factorized approaches, the efficiency claims would benefit from explicit cross-references to the corresponding equations in the foundations section.

Simulated Author's Rebuttal

0 responses · 0 unresolved

We thank the referee for their positive evaluation of the manuscript and for recommending acceptance. We are pleased that the synthesis is viewed as providing a useful organizational framework with consistent notation and emphasis on conditions for posterior accuracy.

Circularity Check

0 steps flagged

No significant circularity: review synthesizes external literature without new derivations

full rationale

The manuscript is a review paper whose central contribution is organizational synthesis of eight existing diffusion-based SBI methods for non-ideal data scenarios. It introduces no new theorems, derivations, equations, or empirical results, instead restating mathematical foundations and conditions from prior literature while maintaining consistent notation. No load-bearing steps reduce by construction to self-definitions, fitted parameters renamed as predictions, or self-citation chains, as all claims trace to independently published external works. The paper is therefore self-contained against external benchmarks with no internal circularity.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

This is a review paper that introduces no new free parameters or invented entities; its one standing assumption is the prior literature it surveys on the foundations of diffusion models and SBI.

axioms (1)
  • standard math — foundations of diffusion models and simulation-based inference
    The review builds on established principles in these areas, as described in the abstract.

pith-pipeline@v0.9.0 · 5484 in / 1125 out tokens · 47727 ms · 2026-05-16T18:55:04.364551+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

93 extracted references · 93 canonical work pages · 13 internal anchors

  1. [1] Hull, R., Leonarduzzi, E., De La Fuente, L., Tran, H.V., Bennett, A., Melchior, P., Maxwell, R.M., Condon, L.E.: Using simulation-based inference to determine the parameters of an integrated hydrologic model: a case study from the upper Colorado River basin. Groundwater hydrology/Modelling approaches (2022). https://doi.org/10.5194/hess-2022-345

  2. [2] Gloeckler, M., Deistler, M., Weilbach, C., Wood, F., Macke, J.H.: All-in-one simulation-based inference. arXiv:2404.09636 (2024). https://doi.org/10.48550/arXiv.2404.09636

  3. [3] Radev, S.T., Mertens, U.K., Voss, A., Ardizzone, L., Köthe, U.: BayesFlow: Learning Complex Stochastic Models With Invertible Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(4), 1452–1466 (2022). https://doi.org/10.1109/TNNLS.2020.3042395

  4. [4] Wang, B., Leja, J., Villar, V.A., Speagle, J.S.: SBI++: Flexible, ultra-fast likelihood-free inference customized for astronomical applications. The Astrophysical Journal Letters 952(1), 10 (2023). https://doi.org/10.3847/2041-8213/ace361

  5. [5] Alsing, J., Wandelt, B., Feeney, S.: Fast likelihood-free cosmology with neural density estimators and active learning. Monthly Notices of the Royal Astronomical Society 488(3), 4440–4458 (2019). https://doi.org/10.1093/mnras/stz1960

  6. [6] Cranmer, K., Brehmer, J., Louppe, G.: The frontier of simulation-based inference. Proceedings of the National Academy of Sciences 117(48) (2020). https://doi.org/10.1073/pnas.1912789117

  7. [7] Andry, G.: Data assimilation as simulation-based inference.

  8. [8] Greenberg, D., Nonnenmacher, M., Macke, J.: Automatic Posterior Transformation for Likelihood-Free Inference. In: Proceedings of the 36th International Conference on Machine Learning, pp. 2404–2414. PMLR (2019). https://proceedings.mlr.press/v97/greenberg19a.html

  9. [9] Delaunoy, A.: Low-Budget Simulation-Based Inference with Bayesian Neural Networks. https://arxiv.org/html/2408.15136v1

  10. [10] Lueckmann, J.-M., Boelts, J., Greenberg, D., Goncalves, P., Macke, J.: Benchmarking simulation-based inference. In: Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, pp. 343–351. PMLR (2021). https://proceedings.mlr.press/v130/lueckmann21a.html

  11. [11] Boelts, J., Lueckmann, J.-M., Gao, R., Macke, J.H.: Flexible and efficient simulation-based inference for models of decision-making. eLife 11, 77220 (2022). https://doi.org/10.7554/eLife.77220

  12. [12] Ward, D., Cannon, P., Beaumont, M., Fasiolo, M., Schmon, S.M.: Robust Neural Posterior Estimation and Statistical Model Criticism. arXiv:2210.06564 (2022). https://doi.org/10.48550/arXiv.2210.06564

  13. [13] Falkiewicz, M., Takeishi, N., Shekhzadeh, I., Wehenkel, A., Delaunoy, A., Louppe, G., Kalousis, A.: Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability (2023). https://arxiv.org/abs/2310.13402

  14. [14] Schmitt, M., Bürkner, P.-C., Köthe, U., Radev, S.T.: Detecting Model Misspecification in Amortized Bayesian Inference with Neural Networks: An Extended Investigation (2024). https://arxiv.org/abs/2406.03154

  15. [15] Papamakarios, G., Sterratt, D.C., Murray, I.: Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows. arXiv:1805.07226 (2019). https://doi.org/10.48550/arXiv.1805.07226

  16. [16] Hermans, J., Delaunoy, A., Rozet, F., Wehenkel, A., Begy, V., Louppe, G.: A Trust Crisis In Simulation-Based Inference? Your Posterior Approximations Can Be Unfaithful. arXiv:2110.06581 (2022). https://doi.org/10.48550/arXiv.2110.06581

  17. [17] Papamakarios, G., Murray, I.: Fast Epsilon-free Inference of Simulation Models with Bayesian Conditional Density Estimation. arXiv:1605.06376 (2018). https://doi.org/10.48550/arXiv.1605.06376

  18. [18] Verma, Y., Bharti, A., Garg, V.: Robust simulation-based inference under missing data via neural processes. In: Proceedings of the Thirteenth International Conference on Learning Representations (2025). https://openreview.net/forum?id=GsR3zRCRX5

  19. [19] Lueckmann, J.-M., Bassetto, G., Karaletsos, T., Macke, J.H.: Likelihood-free inference with emulator networks (2019). https://arxiv.org/abs/1805.09294

  20. [20] Durkan, C., Murray, I., Papamakarios, G.: On contrastive learning for likelihood-free inference. In: Proceedings of the 37th International Conference on Machine Learning, pp. 2771–2781. PMLR (2020). https://proceedings.mlr.press/v119/durkan20a.html

  21. [21] Hermans, J., Begy, V., Louppe, G.: Likelihood-free MCMC with amortized approximate ratio estimators. In: Proceedings of the 37th International Conference on Machine Learning, pp. 4239–4248. PMLR (2020). https://proceedings.mlr.press/v119/hermans20a.html

  22. [23] Ramesh, P., Lueckmann, J.-M., Boelts, J., Tejero-Cantero, Á., Greenberg, D.S., Gonçalves, P.J., Macke, J.H.: GATSBI: Generative Adversarial Training for Simulation-Based Inference. OpenReview (2021). https://openreview.net/forum?id=kR1hC6j48Tp

  23. [24] Deistler, M., Goncalves, P.J., Macke, J.H.: Truncated proposals for scalable and hassle-free simulation-based inference. arXiv:2210.04815 (2022). https://doi.org/10.48550/arXiv.2210.04815

  24. [25] Glaser, P., Arbel, M., Doucet, A., Gretton, A.: Maximum likelihood learning of energy-based models for simulation-based inference. In: Advances in Approximate Bayesian Inference (AABI 2023) (2023). https://openreview.net/forum?id=gL68u5UuWa

  25. [26] Sharrock, L., Simons, J., Liu, S., Beaumont, M.: Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models. arXiv:2210.04872 (2024). https://doi.org/10.48550/arXiv.2210.04872

  26. [27] Liu, N., Li, S., Du, Y., Torralba, A., Tenenbaum, J.B.: Compositional Visual Generation with Composable Diffusion Models. arXiv:2206.01714 (2023). https://doi.org/10.48550/arXiv.2206.01714

  27. [28] Tashiro, Y., Song, J., Song, Y., Ermon, S.: CSDI: Conditional score-based diffusion models for probabilistic time series imputation. In: Advances in Neural Information Processing Systems, vol. 34, pp. 24804–24816 (2021).

  28. [29] Sohl-Dickstein, J., Weiss, E.A., Maheswaranathan, N., Ganguli, S.: Deep Unsupervised Learning using Nonequilibrium Thermodynamics (2015). https://arxiv.org/abs/1503.03585

  29. [30] Ho, J., Jain, A., Abbeel, P.: Denoising Diffusion Probabilistic Models. In: Advances in Neural Information Processing Systems, vol. 33, pp. 6840–6851. Curran Associates, Inc. (2020).

  30. [31] Song, Y., Ermon, S.: Generative Modeling by Estimating Gradients of the Data Distribution. arXiv:1907.05600 (2020). https://doi.org/10.48550/arXiv.1907.05600

  31. [32] Anderson, B.D.O.: Reverse-time diffusion equation models. Stochastic Processes and their Applications 12(3), 313–326 (1982). https://doi.org/10.1016/0304-4149(82)90051-5

  32. [33] Hyvärinen, A.: Estimation of non-normalized statistical models by score matching. Journal of Machine Learning Research 6(24), 695–709 (2005).

  33. [34] Deistler, M., Boelts, J., Steinbach, P., Moss, G., Moreau, T., Gloeckler, M., Rodrigues, P.L.C., Linhart, J., Lappalainen, J.K., Miller, B.K., Gonçalves, P.J., Lueckmann, J.-M., Schröder, C., Macke, J.H.: Simulation-Based Inference: A Practical Guide (2025). https://arxiv.org/abs/2508.12939

  34. [35] Kelly, R.P., Nott, D.J., Frazier, D.T., Warne, D.J., Drovandi, C.: Misspecification-robust Sequential Neural Likelihood for Simulation-based Inference. arXiv:2301.13368 (2024). https://doi.org/10.48550/arXiv.2301.13368

  35. [36] Kelly, R.P., Warne, D.J., Frazier, D.T., Nott, D.J., Gutmann, M.U., Drovandi, C.: Simulation-based Bayesian inference under model misspecification. arXiv:2503.12315 (2025). https://doi.org/10.48550/arXiv.2503.12315

  36. [37] Chen, T., Bansal, V., Scott, J.G.: Conditional diffusions for amortized neural posterior estimation. arXiv:2410.19105 (2025). https://doi.org/10.48550/arXiv.2410.19105

  37. [38] Baldassari, L., Siahkoohi, A., Garnier, J., Solna, K., Hoop, M.V.: Conditional score-based diffusion models for Bayesian inference in infinite dimensions. Advances in Neural Information Processing Systems 36, 24262–24290 (2023).

  38. [39] Dax, M., Wildberger, J., Buchholz, S., Green, S.R., Macke, J.H., Schölkopf, B.: Flow Matching for Scalable Simulation-Based Inference. arXiv:2305.17161 (2023). https://doi.org/10.48550/arXiv.2305.17161

  39. [40] Geffner, T., Papamakarios, G., Mnih, A.: Compositional Score Modeling for Simulation-based Inference (2023). https://arxiv.org/abs/2209.14249

  40. [41] Schmitt, M., Pratz, V., Köthe, U., Bürkner, P.-C., Radev, S.T.: Consistency Models for Scalable and Fast Simulation-Based Inference (2024). https://arxiv.org/abs/2312.05440

  41. [42] Elsemüller, L., Olischläger, H., Schmitt, M., Bürkner, P.-C., Köthe, U., Radev, S.T.: Sensitivity-Aware Amortized Bayesian Inference. arXiv:2310.11122 (2024). https://doi.org/10.48550/arXiv.2310.11122

  42. [43] Chang, P.E., Loka, N., Huang, D., Remes, U., Kaski, S., Acerbi, L.: Amortized Probabilistic Conditioning for Optimization, Simulation and Inference (2025). https://arxiv.org/abs/2410.15320

  43. [44] Whittle, G., Ziomek, J., Rawling, J., Osborne, M.A.: Distribution Transformers: Fast Approximate Bayesian Inference With On-The-Fly Prior Adaptation (2025). https://arxiv.org/abs/2502.02463

  44. [45] Zaheer, M., Kottur, S., Ravanbakhsh, S., Poczos, B., Salakhutdinov, R., Smola, A.: Deep Sets. arXiv:1703.06114 (2018). https://doi.org/10.48550/arXiv.1703.06114

  45. [46] Bortoli, V.D., Thornton, J., Heng, J., Doucet, A.: Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling (2023). https://arxiv.org/abs/2106.01357

  46. [47] Shi, Y., Bortoli, V.D., Deligiannidis, G., Doucet, A.: Conditional simulation using diffusion Schrödinger bridges. In: Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, pp. 1792–1802. PMLR (2022). https://proceedings.mlr.press/v180/shi22a.html

  47. [48] Song, Y., Sohl-Dickstein, J., Kingma, D.P., Kumar, A., Ermon, S., Poole, B.: Score-Based Generative Modeling through Stochastic Differential Equations. arXiv:2011.13456 (2021). https://doi.org/10.48550/arXiv.2011.13456

  48. [49] Karras, T., Aittala, M., Aila, T., Laine, S.: Elucidating the Design Space of Diffusion-Based Generative Models (2022). https://arxiv.org/abs/2206.00364

  49. [50] Simons, J., Sharrock, L., Liu, S., Beaumont, M.: Neural score estimation: Likelihood-free inference with conditional score based diffusion models. In: Proceedings of the Twelfth International Conference on Learning Representations (2024). https://openreview.net/forum?id=AVkJEb1ahOY

  50. [51] Chang, P.E., Rissanen, S., Loka, N., Huang, D.: Inference-time prior adaptation in simulation-based inference via guided diffusion models. In: Frontiers in Probabilistic Inference Workshop at ICLR (2025). https://openreview.net/forum?id=WMOsDItRu4

  51. [52] Dupont, E., Kim, H., Eslami, S.M.A., Rezende, D.J., Rosenbaum, D.: From data to functa: Your data point is a function and you should treat it like one. CoRR abs/2201.12204 (2022).

  52. [53] Phillips, A., Seror, T., Hutchinson, M., Bortoli, V.D., Doucet, A., Mathieu, E.: Spectral Diffusion Processes (2022). https://arxiv.org/abs/2209.14125

  53. [54] Kerrigan, G., Ley, J., Smyth, P.: Diffusion Generative Models in Infinite Dimensions (2023). https://arxiv.org/abs/2212.00886

  54. [55] Batzolis, G., Stanczuk, J., Schönlieb, C.-B., Etmann, C.: Conditional Image Generation with Score-Based Diffusion Models. arXiv:2111.13606 (2021). https://doi.org/10.48550/arXiv.2111.13606

  55. [56] Pidstrigach, J., Marzouk, Y., Reich, S., Wang, S.: Infinite-Dimensional Diffusion Models (2025). https://arxiv.org/abs/2302.10130

  56. [57] Wehenkel, A., Gamella, J.L., Sener, O., Behrmann, J., Sapiro, G., Jacobsen, J.-H., Cuturi, M.: Addressing Misspecification in Simulation-based Inference through Data-driven Calibration (2025). https://arxiv.org/abs/2405.08719

  57. [58] Wang, Z., Hasenauer, J., Schälte, Y.: Missing data in amortized simulation-based neural posterior estimation. PLOS Computational Biology 20(6), 1012184 (2024). https://doi.org/10.1371/journal.pcbi.1012184

  58. [59] Winkler, R.L.: Uncertainty in probabilistic risk assessment. Reliability Engineering & System Safety 54(2), 127–132 (1996). https://doi.org/10.1016/S0951-8320(96)00070-1

  59. [60] Huang, D., Bharti, A., Holanda De Souza Junior, A., Acerbi, L., Kaski, S.: Learning Robust Statistics for Simulation-based Inference under Model Misspecification, 1–22 (2024).

  60. [61] Schmitt, M., Odole, L., Radev, S.T., Bürkner, P.-C.: Fuse It or Lose It: Deep Fusion for Multimodal Simulation-Based Inference. arXiv:2311.10671 (2024). https://doi.org/10.48550/arXiv.2311.10671

  61. [62] Price, L.F., Drovandi, C.C., Lee, A., Nott, D.J.: Bayesian synthetic likelihood. Journal of Computational and Graphical Statistics 27(1), 1–11 (2018).

  62. [63] Frazier, D.T., Nott, D.J., Drovandi, C., Kohn, R.: Bayesian inference using synthetic likelihood: asymptotics and adjustments (2021). https://arxiv.org/abs/1902.04827

  63. [64] Rozet, F., Louppe, G.: Arbitrary Marginal Neural Ratio Estimation for Simulation-based Inference (2021). https://arxiv.org/abs/2110.00449

  64. [65] Moss, G., Višnjević, V., Eisen, O., Oraschewski, F.M., Schröder, C., Macke, J.H., Drews, R.: Simulation-Based Inference of Surface Accumulation and Basal Melt Rates of an Antarctic Ice Shelf from Isochronal Layers. Journal of Glaciology 71, 44 (2025). https://doi.org/10.1017/jog.2025.13

  65. [66] Manzano-Patron, J.P., Deistler, M., Schröder, C., Kypraios, T., Gonçalves, P.J., Macke, J.H., Sotiropoulos, S.N.: Uncertainty mapping and probabilistic tractography using Simulation-based Inference in diffusion MRI: A comparison with classical Bayes. Neuroscience (2024). https://doi.org/10.1101/2024.11.19.624267

  66. [67] Simons, J.: Simulation-based inference with modern generative modelling. PhD thesis, University of Bristol (2024).

  67. [68] Vincent, P.: A connection between score matching and denoising autoencoders. Neural Computation 23(7), 1661–1674 (2011).

  68. [69] Nichol, A., Dhariwal, P.: Improved Denoising Diffusion Probabilistic Models (2021). https://arxiv.org/abs/2102.09672

  69. [70] Ho, J., Salimans, T.: Classifier-Free Diffusion Guidance (2022). https://arxiv.org/abs/2207.12598

  70. [71] Rasul, K., Seward, C., Schuster, I., Vollgraf, R.: Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting (2021). https://arxiv.org/abs/2101.12072

  71. [72] Song, J., Vahdat, A., Mardani, M., Kautz, J.: Pseudoinverse-guided diffusion models for inverse problems. In: Proceedings of the Eleventh International Conference on Learning Representations (2023).

  72. [73] Rozet, F., Louppe, G.: Score-based Data Assimilation. arXiv:2306.10574 (2023). https://doi.org/10.48550/arXiv.2306.10574

  73. [74] Joel, L.O., Doorsamy, W., Paul, B.S.: A Review of Missing Data Handling Techniques for Machine Learning. International Journal of Innovative Technology and Interdisciplinary Sciences 5(3), 971–1005 (2022). https://doi.org/10.15157/IJITIS.2022.5.3.971-1005

  74. [75] McKnight, P.E., McKnight, K.M., Sidani, S., Figueredo, A.J.: Missing Data: A Gentle Introduction. Guilford Press (2007).

  75. [76] Liu, Y., Wang, R., Gu, Y., Li, C., Wang, G.: Physics-inspired and data-driven two-stage deep learning approach for wind field reconstruction with experimental validation. Energy 298(C) (2024).

  76. [77] Shukla, S.N., Marlin, B.M.: A Survey on Principles, Models and Methods for Learning from Irregularly Sampled Time Series. arXiv:2012.00168 (2021). https://doi.org/10.48550/arXiv.2012.00168

  77. [78] Stevens, T.S.W., Nolan, O., Robert, J.-L., Van Sloun, R.J.G.: Sequential Posterior Sampling with Diffusion Models. In: ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1–5 (2025). https://doi.org/10.1109/ICASSP49660.2025.10889752

  78. [79] Bartosh, G., Vetrov, D., Naesseth, C.A.: Neural Diffusion Models (2024). https://arxiv.org/abs/2310.08337

  79. [80] Weilbach, C.D., Harvey, W., Wood, F.: Graphically Structured Diffusion Models. In: Proceedings of the 40th International Conference on Machine Learning, pp. 36887–36909. PMLR (2023). https://proceedings.mlr.press/v202/weilbach23a.html

  80. [81] Nautiyal, M., Hellander, A., Singh, P.: ConDiSim: Conditional Diffusion Models for Simulation Based Inference. arXiv:2505.08403 (2025). https://doi.org/10.48550/arXiv.2505.08403

Showing first 80 references.