pith. machine review for the scientific record.

arxiv: 2604.07416 · v1 · submitted 2026-04-08 · 💻 cs.LG · cond-mat.mtrl-sci · physics.comp-ph

Recognition: no theorem link

Bayesian Optimization for Mixed-Variable Problems in the Natural Sciences

Authors on Pith: no claims yet

Pith reviewed 2026-05-10 18:20 UTC · model grok-4.3

classification 💻 cs.LG · cond-mat.mtrl-sci · physics.comp-ph
keywords Bayesian optimization · mixed-variable optimization · probabilistic reparameterization · Gaussian processes · black-box optimization · scientific applications · gradient-based acquisition

The pith

Generalizing probabilistic reparameterization enables gradient-based Bayesian optimization in fully mixed continuous-discrete spaces.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper aims to make Bayesian optimization work efficiently for expensive black-box problems that mix continuous parameters with discrete ones whose values are unevenly spaced, a common setup in natural-science experiments. It does this by extending probabilistic reparameterization so that the acquisition function can still be optimized with gradients even when discrete choices are not evenly spaced. This matters because standard methods lose sample efficiency or become computationally heavy once gradients disappear in mixed or high-cardinality spaces. The authors support the claim with benchmarks on both synthetic functions and real experimental objectives, showing the approach remains stable under noise and limited data.

Core claim

By generalizing the probabilistic reparameterization approach of Daulton et al. to non-equidistant discrete variables, gradient-based optimization of the acquisition function becomes possible in fully mixed-variable Bayesian optimization with Gaussian process surrogates. Systematic benchmarks on synthetic and experimental objectives confirm robustness, and the method further enables efficient search over highly discontinuous and discretized landscapes when paired with a modified workflow.

What carries the argument

Generalized probabilistic reparameterization for non-equidistant discrete variables, which relaxes the mixed space into a continuous domain suitable for gradient-based acquisition optimization while preserving the original variable structure.
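The relaxation can be sketched in a few lines. Everything below is illustrative, not the authors' implementation: the acquisition function is a toy quadratic and the softmax parameterization is one concrete choice. The point is only that once the non-equidistant discrete variable is replaced by a categorical distribution with continuous logits, the expected acquisition value becomes differentiable in every input.

```python
import numpy as np

# Illustrative sketch (toy acquisition, hypothetical softmax parameterization,
# not the authors' code). A discrete variable with irregularly spaced levels
# is replaced by a categorical distribution whose logits theta are continuous:
#     E[alpha](x, theta) = sum_k p_k(theta) * alpha(x, level_k)
# which is differentiable in both x and theta and can be climbed by gradients.

levels = np.array([0.1, 0.5, 3.0])   # non-equidistant discrete choices

def alpha(x, z):
    """Toy stand-in for an acquisition function over (continuous x, discrete z)."""
    return -(x - 1.0) ** 2 - (z - 0.5) ** 2

def softmax(theta):
    e = np.exp(theta - theta.max())
    return e / e.sum()

def expected_alpha_grads(x, theta):
    p = softmax(theta)
    a = np.array([alpha(x, z) for z in levels])
    ea = float(p @ a)
    dx = -2.0 * (x - 1.0)            # d(ea)/dx: the x-term is common to all levels
    dtheta = p * (a - ea)            # softmax Jacobian: dp_k/dth_j = p_k(d_kj - p_j)
    return ea, dx, dtheta

x, theta = 0.0, np.zeros(3)
for _ in range(500):                 # plain gradient ascent on the relaxed objective
    _, dx, dtheta = expected_alpha_grads(x, theta)
    x += 0.05 * dx
    theta += 0.5 * dtheta

p = softmax(theta)
best_level = float(levels[np.argmax(p)])  # collapse the relaxation at the end
# converges to x near 1.0 with nearly all probability mass on the level 0.5
```

The same idea scales to many discrete dimensions by taking the product of per-dimension categoricals; for high cardinalities the exact sum over levels is replaced by Monte Carlo estimates, which is where the gradient machinery pays off.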

If this is right

  • Enables sample-efficient search in high-cardinality discrete spaces typical of scientific parameter tuning.
  • Supports optimization of noisy and discontinuous objectives when the workflow is adjusted accordingly.
  • Provides a practical framework for autonomous laboratory settings with limited data and mixed variable types.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same relaxation idea could be tested with surrogate models other than Gaussian processes.
  • It may help with mixed-variable problems in domains outside natural sciences, such as engineering design.
  • Further scaling tests on problems with dozens of discrete levels would clarify limits.

Load-bearing premise

That benchmarks on synthetic and experimental objectives sufficiently demonstrate the method works robustly for real scientific tasks that include noise and discretization.

What would settle it

A mixed-variable scientific objective where the generalized method requires substantially more evaluations than standard discrete-handling techniques to reach comparable optima.

Figures

Figures reproduced from arXiv: 2604.07416 by Matthias Stosiek, Patrick Rinke, Ti John, Yuhao Zhang.

Figure 1. Synthetic BS benchmark problems within a function space of dimensionality and variable type, where the fractions indicate the proportions of dimensions that are continuous (C), integer (I), or discrete (D).
Figure 2. PR using EI with the subtracted fitted noise hyperparameter (left, a–c) shows repeated sampling during …
Figure 3. Mean ranks in terms of composite score (as defined in Eq. 7) of all models across each …
Figure 4. Histogram illustrating absolute convergence performance, measured as the percentage of converged …
Figure 5. Mean convergence plots of specific chosen models on the …
Figure 6. Convergence plots of the BOSS_on_gam models on the DUST1 and DUST2 benchmark functions using only the penalty method (red for the EI AF, green for the LCB AF) versus the penalty + modified AF (mAF) approach (blue and magenta). Sobol sampling is shown in black, and the RF model in maroon and orange. The bold curves are the mean model convergences from their 10 different color-coded Sobol-point-initiated runs…
read the original abstract

Optimizing expensive black-box objectives over mixed search spaces is a common challenge across the natural sciences. Bayesian optimization (BO) offers sample-efficient strategies through probabilistic surrogate models and acquisition functions. However, its effectiveness diminishes in mixed or high-cardinality discrete spaces, where gradients are unavailable and optimizing the acquisition function becomes computationally demanding. In this work, we generalize the probabilistic reparameterization (PR) approach of Daulton et al. to handle non-equidistant discrete variables, enabling gradient-based optimization in fully mixed-variable settings with Gaussian process (GP) surrogates. With real-world scientific optimization tasks in mind, we conduct systematic benchmarks on synthetic and experimental objectives to obtain an optimized kernel formulations and demonstrate the robustness of our generalized PR method. We additionally show that, when combined with a modified BO workflow, our approach can efficiently optimize highly discontinuous and discretized objective landscapes. This work establishes a practical BO framework for addressing fully mixed optimization problems in the natural sciences, and is particularly well suited to autonomous laboratory settings where noise, discretization, and limited data are inherent.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 2 minor

Summary. The manuscript generalizes the probabilistic reparameterization (PR) approach of Daulton et al. to non-equidistant discrete variables, enabling gradient-based optimization of acquisition functions within Gaussian process surrogates for fully mixed-variable Bayesian optimization problems. It reports systematic benchmarks on synthetic and experimental objectives drawn from the natural sciences, selection of optimized kernel formulations, and a modified BO workflow for handling highly discontinuous and discretized landscapes.

Significance. If the generalization and empirical results hold, the work supplies a practical, gradient-enabled BO framework for mixed-variable optimization tasks that frequently arise in scientific applications, including autonomous laboratory settings with noise, discretization, and limited data. The extension of PR to non-equidistant discretes, together with the reported benchmarks and workflow modification, represents a useful incremental contribution that can improve sample efficiency over standard mixed-variable methods.

minor comments (2)
  1. [Abstract] The phrasing 'to obtain an optimized kernel formulations' pairs the singular article 'an' with a plural noun; revise for clarity.
  2. [Methods] The manuscript should include an explicit statement or small example (e.g., in the methods section) showing how the reparameterization map is constructed for a non-equidistant discrete variable with irregular spacing, to make the generalization immediately reproducible.
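For illustration of what such a statement could look like, here is one plausible form of the map (a hypothetical construction, not necessarily the paper's): a continuous location splits probability mass between the two levels that bracket it, in proportion to distance. For equidistant levels this reduces to the standard PR construction.

```python
import numpy as np

# Hypothetical reparameterization map for irregular spacing (for illustration
# only; the paper's exact construction may differ). A continuous location c in
# [levels[0], levels[-1]] places all probability mass on the two levels that
# bracket c, split linearly by distance. The map is mean-preserving:
# sum_k p_k * level_k == c.

def pr_map(c, levels):
    """Return p(z = level_k | c) for sorted, non-equidistant levels."""
    levels = np.asarray(levels, dtype=float)
    p = np.zeros(len(levels))
    c = float(np.clip(c, levels[0], levels[-1]))
    i = int(np.searchsorted(levels, c, side="right")) - 1
    if i >= len(levels) - 1:         # c sits exactly on the last level
        p[-1] = 1.0
        return p
    w = (c - levels[i]) / (levels[i + 1] - levels[i])
    p[i], p[i + 1] = 1.0 - w, w
    return p

levels = [0.1, 0.5, 3.0]             # irregular spacing
p = pr_map(1.75, levels)             # halfway through the wide [0.5, 3.0] gap
# p is [0, 0.5, 0.5]: mass split evenly between the bracketing levels
```

Because `p` is piecewise-linear in `c`, expectations taken under it are differentiable in `c` almost everywhere, which is exactly the property gradient-based acquisition optimization needs.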

Simulated Author's Rebuttal

0 responses · 0 unresolved

We thank the referee for their positive and constructive review, which accurately summarizes our generalization of probabilistic reparameterization to non-equidistant discrete variables and its application to mixed-variable Bayesian optimization with Gaussian process surrogates. The recommendation for minor revision is appreciated, and we are pleased that the work is viewed as a useful incremental contribution for scientific applications. No specific major comments were raised in the report.

Circularity Check

0 steps flagged

No significant circularity detected

full rationale

The paper's central contribution is a generalization of the externally cited probabilistic reparameterization (PR) method from Daulton et al. to non-equidistant discrete variables, enabling gradient-based acquisition optimization with GP surrogates in mixed spaces. No load-bearing step reduces to a self-definition, a fitted parameter renamed as a prediction, or a self-citation chain; the derivation builds directly on independent prior work. Systematic benchmarks on synthetic and experimental objectives are presented as empirical validation rather than as the source of the method itself. The method is validated against external benchmarks, with no evident internal step that reduces the claimed results to their inputs by construction.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The work rests on standard Bayesian optimization assumptions without introducing new free parameters or invented entities beyond the cited PR technique.

axioms (2)
  • domain assumption Gaussian process surrogates are suitable models for the black-box objectives under consideration
    Implicit in the use of GP-based BO throughout the abstract.
  • domain assumption The objectives are expensive black-box functions where sample efficiency matters
    Core premise of the Bayesian optimization setting described.
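To make the two assumptions concrete, here is a minimal GP-surrogate expected-improvement step (a standard textbook construction, not the paper's implementation): a GP models the expensive objective from a handful of evaluations, and the acquisition decides where the next costly evaluation goes.

```python
import numpy as np
from math import erf, sqrt, pi

# Minimal GP surrogate + expected improvement (textbook construction, not the
# paper's code), illustrating the two domain assumptions: a GP models the
# expensive black-box objective, and the acquisition spends each costly
# evaluation deliberately.

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and variance at test points Xs given data (X, y)."""
    K_inv = np.linalg.inv(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ K_inv @ y
    var = np.diag(rbf(Xs, Xs) - Ks.T @ K_inv @ Ks)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    """EI under a minimization convention."""
    s = np.sqrt(var)
    z = (best - mu) / s
    cdf = 0.5 * (1.0 + np.array([erf(v / sqrt(2.0)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return s * (z * cdf + pdf)

f = lambda x: (x - 0.7) ** 2          # stand-in "expensive" objective
X = np.array([0.0, 0.4, 1.0])         # three evaluations spent so far
y = f(X)
Xs = np.linspace(0.0, 1.0, 101)
mu, var = gp_posterior(X, y, Xs)
ei = expected_improvement(mu, var, y.min())
x_next = float(Xs[np.argmax(ei)])     # next point worth an expensive evaluation
```

If either assumption fails (the objective is poorly modeled by the GP kernel, or evaluations are cheap enough to grid-search), the machinery above loses its advantage, which is why the ledger flags both as load-bearing.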

pith-pipeline@v0.9.0 · 5492 in / 1319 out tokens · 82667 ms · 2026-05-10T18:20:12.406818+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

51 extracted references · 36 canonical work pages · 2 internal anchors

  1. [2] Joakim Löfgren, Dmitry Tarasov, Taru Koitto, Patrick Rinke, Mikhail Balakshin, and Milica Todorović. Machine learning optimization of lignin properties in green biorefineries. ACS Sustainable Chemistry & Engineering, 10(29):9469–9479, 2022. doi:10.1021/acssuschemeng.2c01895.

  2. [4] Jack K. Pedersen, Christian M. Clausen, Olga A. Krysiak, Bin Xiao, Thomas A. A. Batchelor, Tobias Löffler, Vladislav A. Mints, Lars Banko, Matthias Arenz, Alan Savan, Wolfgang Schuhmann, Alfred Ludwig, and Jan Rossmeisl. Bayesian optimization of high-entropy alloy compositions for electrocatalytic oxygen reduction. Angewandte Chemie International Edition, …

  3. [5] Yuhao Zhang, Maija Vaara, Azin Alesafar, Duc Bach Nguyen, Pedro Silva, Laura Koskelo, Jussi Ristolainen, Matthias Stosiek, Joakim Löfgren, Jaana Vapaavuori, and Patrick Rinke. Data-efficient optimization of thermally-activated polymer actuators through machine learning. Materials & Design, 253:113908, 2025. ISSN 0264-1275. doi:10.1016/j.mat…

  4. [6] Alexey Sanin, Jackson K. Flowers, Tobias H. Piotrowiak, Frederic Felsen, Leon Merker, Alfred Ludwig, Dominic Bresser, and Helge Sören Stein. Integrating automated electrochemistry and high-throughput characterization with machine learning to explore Si–Ge–Sn thin-film lithium battery anodes. Advanced Energy Materials, 15(11):2404961, 2025. doi:…

  5. [7] Daryna Diment, Joakim Löfgren, Marie Alopaeus, Matthias Stosiek, MiJung Cho, Chunlin Xu, Michael Hummel, Davide Rigo, Patrick Rinke, and Mikhail Balakshin. Enhancing lignin-carbohydrate complexes production and properties with machine learning. ChemSusChem, 18(8):e202401711, 2025. doi:10.1002/cssc.202401711.

  6. [8] Isaac Y. Miranda-Valdez, Tero Mäkinen, Juha Koivisto, and Mikko J. Alava. Bayesian optimization to infer parameters in viscoelasticity. Journal of Rheology, 69(6):1059–1066. ISSN 0148-6055. doi:10.1122/8.0001068.

  7. [10] D.C. Montgomery. Design and Analysis of Experiments. John Wiley & Sons, Incorporated. ISBN 9781119113478. URL https://books.google.fi/books?id=Py7bDgAAQBAJ.

  8. [12] Stewart Greenhill, Santu Rana, Sunil Gupta, Pratibha Vellanki, and Svetha Venkatesh. Bayesian optimization for adaptive experimental design: A review. IEEE Access, PP:1–1. doi:10.1109/ACCESS.2020.2966228.

  9. [14] Shijing Sun, Armi Tiihonen, Felipe Oviedo, Zhe Liu, Janak Thapa, Yicheng Zhao, Noor Titan P. Hartono, Anuj Goyal, Thomas Heumueller, Clio Batali, Alex Encinas, Jason J. Yoo, Ruipeng Li, Zekun Ren, I. Marius Peters, Christoph J. Brabec, Moungi G. Bawendi, Vladan Stevanovic, John Fisher, and Tonio Buonassisi. A data fusion approach to optimize compositional…

  10. [15] Yifan Wu, Aron Walsh, and Alex M. Ganose. Race to the bottom: Bayesian optimisation for chemical problems. Digital Discovery, 3(6):1086–1100, 2024. doi:10.1039/D3DD00234A.

  11. [16] Malthe K. Bisbo and Bjørk Hammer. Efficient global structure optimization with a machine-learned surrogate model. Phys. Rev. Lett., 124:086102. doi:10.1103/PhysRevLett.124.086102.

  12. [18] Soo-Ah Jin, Tero Kämäräinen, Patrick Rinke, Orlando J. Rojas, and Milica Todorović. Machine learning as a tool to engineer microstructures: Morphological prediction of tannin-based colloids using Bayesian surrogate models. MRS Bulletin, 47(1):29–37, 2022. ISSN 1938-1425. doi:10.1557/s43577-021-00183-4.

  13. [19] Chuan He, Martin Singull, and T. Jesper Jacobsson. Bayesian optimisation for the experimental sciences: A practical guide to data-efficient optimisation of laboratory workflows. Advanced Intelligent Systems, n/a(n/a):e202501149, 2026. ISSN 2640-4567. doi:10.1002/aisy.202501149.

  14. [20] Carl Edward Rasmussen and Christopher K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, 2006. ISBN 026218253X. URL http://www.GaussianProcess.org/gpml.

  15. [21] Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, and Eytan Bakshy. Bayesian optimization over discrete and mixed spaces via probabilistic reparameterization. arXiv preprint arXiv:2210.10199, 2022. URL https://arxiv.org/abs/2210.10199.

  16. [22] Xingchen Wan, Vu Nguyen, Huong Ha, Binxin Ru, Cong Lu, and Michael A. Osborne. Think global and act local: Bayesian optimisation over high-dimensional categorical and mixed search spaces, 2021. URL https://arxiv.org/abs/2102.07188.

  17. [23] Eduardo C. Garrido-Merchán and Daniel Hernández-Lobato. Dealing with categorical and integer-valued variables in Bayesian optimization with Gaussian processes. Neurocomputing, 380:20–35, 2020. ISSN 0925-2312. doi:10.1016/j.neucom.2019.11.004.

  18. [24] Kamil Dreczkowski, Antoine Grosnit, and Haitham Bou Ammar. Framework and benchmarks for combinatorial and mixed-variable Bayesian optimization, 2023. URL https://arxiv.org/abs/2306.09803.

  19. [25] Ilya O. Ryzhov. On the convergence rates of expected improvement methods. Operations Research, 64(6):1515–1528, 2016. doi:10.1287/opre.2016.1537.

  20. [26] Philipp Hennig and Christian J. Schuler. Entropy search for information-efficient global optimization. Journal of Machine Learning Research, 13:1809–1837, 2012. URL https://www.jmlr.org/papers/volume13/hennig12a/hennig12a.pdf.

  21. [27] Peter I. Frazier. Bayesian optimization. In INFORMS Tutorials in Operations Research, pages 255–278. INFORMS, 2018. URL https://people.orie.cornell.edu/pfrazier/bo_tutorial.pdf.

  22. [28] George De Ath, Richard M. Everson, Alma A. M. Rahat, and Jonathan E. Fieldsend. Greed is good: Exploration and exploitation trade-offs in Bayesian optimisation. arXiv preprint, abs/1911.12809, 2019. URL https://arxiv.org/abs/1911.12809.

  23. [29] Pierre Geurts, Damien Ernst, and Louis Wehenkel. Extremely randomized trees. Machine Learning, 63(1):3–42, 2006. ISSN 1573-0565. doi:10.1007/s10994-006-6226-1.

  24. [30] Carola Lampe, Ioannis Kouroudis, Milan Harth, Stefan Martin, Alessio Gagliardi, and Alexander S. Urban. Rapid data-efficient optimization of perovskite nanocrystal syntheses through machine learning algorithm fusion. Advanced Materials, 35(16):2208772, 2023. doi:10.1002/adma.202208772.

  25. [31] Hugh A. Chipman, Edward I. George, and Robert E. McCulloch. BART: Bayesian additive regression trees. Annals of Applied Statistics, 4(1):266–298, 2010. doi:10.1214/09-AOAS285. Preprint available at https://arxiv.org/abs/0806.3286.

  26. [32] Toby Boyne, Jose Pablo Folch, Robert M. Lee, Behrang Shafei, and Ruth Misener. BARK: A fully Bayesian tree kernel for black-box optimization. arXiv preprint arXiv:2503.05574. doi:10.48550/arXiv.2503.05574. URL https://arxiv.org/abs/2503.05574.

  27. [34] Hengrui Zhang, Wei (Wayne) Chen, Akshay Iyer, Daniel W. Apley, and Wei Chen. Uncertainty-aware mixed-variable machine learning for materials design. Scientific Reports, 12(1):19760, Nov 2022. ISSN 2045-2322. doi:10.1038/s41598-022-23431-2.

  28. [35] Jhouben Cuesta Ramirez, Rodolphe Le Riche, Olivier Roustant, Guillaume Perrin, Cédric Durantin, and Alain Glière. A comparison of mixed-variables Bayesian optimization approaches. Advanced Modeling and Simulation in Engineering Sciences, 9(1):6, 2022. doi:10.1186/s40323-022-00218-8.

  29. [36] Audrey Olivier, Michael D. Shields, and Lori Graham-Brady. Bayesian neural networks for uncertainty quantification in data-driven materials modeling. Computer Methods in Applied Mechanics and Engineering, 386:114079, 2021. ISSN 0045-7825. doi:10.1016/j.cma.2021.114079.

  30. [37] Sarah I. Allec and Maxim Ziatdinov. Active and transfer learning with partially Bayesian neural networks for materials and chemicals. Digital Discovery, 4:1284–1297, 2025. doi:10.1039/D5DD00027K.

  31. [38] Diederik P. Kingma and Jimmy Ba. Adam: A method for stochastic optimization. CoRR, abs/1412.6980, 2014. URL https://api.semanticscholar.org/CorpusID:6628106.

  32. [39] Rodolphe Le Riche and Victor Picheny. Revisiting Bayesian optimization in the light of the COCO benchmark. Structural and Multidisciplinary Optimization, 64(5):3063–3087. ISSN 1615-1488. doi:10.1007/s00158-021-02977-1.

  33. [41] Benjamin J. Shields, Jason Stevens, Jun Li, Marvin Parasram, Farhan Damani, Jesus I. Martinez Alvarado, Jacob M. Janey, Ryan P. Adams, and Abigail G. Doyle. Bayesian reaction optimization as a tool for chemical synthesis. Nature, 590(7844):89–96, 2021. ISSN 1476-4687. doi:10.1038/s41586-021-03213-y.

  34. [42] Milica Todorović, Michael U. Gutmann, Jukka Corander, and Patrick Rinke. Bayesian inference of atomistic structure in functional materials. npj Computational Materials, 5(1):35, 2019. ISSN 2057-3960. doi:10.1038/s41524-019-0175-2.

  35. [43] Lincan Fang, Esko Makkonen, Milica Todorović, Patrick Rinke, and Xi Chen. Efficient amino acid conformer search with Bayesian optimization. Journal of Chemical Theory and Computation, 17(3):1955–1966, 2021. ISSN 1549-9618. doi:10.1021/acs.jctc.0c00648.

  36. [44] Jingrui Li, Fang Pan, Guo-Xu Zhang, Zenghui Liu, Hua Dong, Dawei Wang, Zhuangde Jiang, Wei Ren, Zuo-Guang Ye, Milica Todorović, and Patrick Rinke. Structural disorder by octahedral tilting in inorganic halide perovskites: New insight with Bayesian optimization. Small Structures, 5(11):2400268, 2024. doi:10.1002/sstr.202400268.

  37. [46] Lei Chen. Curse of Dimensionality, pages 545–546. Springer US, Boston, MA, 2009. ISBN 978-0-387-39940-9. doi:10.1007/978-0-387-39940-9_133.

  38. [47] Armi Tiihonen, Louis Filstroff, Petrus Mikkola, Emma Lehto, Samuel Kaski, Milica Todorović, and Patrick Rinke. More trustworthy Bayesian optimization of materials properties by adding human into the loop. In AI for Accelerated Materials Design NeurIPS 2022 Workshop, 2022. URL https://openreview.net/forum?id=JQSzcd_Zc62.

  39. [48] Carl Edward Rasmussen and Christopher K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, 2006. ISBN 978-0-262-18253-9. URL http://www.gaussianprocess.org/gpml/.

  40. [49] David Duvenaud, Hannes Nickisch, and Carl Edward Rasmussen. Additive Gaussian processes. In Advances in Neural Information Processing Systems, volume 24, 2011. URL https://proceedings.neurips.cc/paper/2011/hash/7cce53cf90577442771720a370c3c723-Abstract.html.

  41. [50] Tsung-Yi Lin, Michael Maire, Serge Belongie, Lubomir Bourdev, Ross Girshick, James Hays, Pietro Perona, Deva Ramanan, C. Lawrence Zitnick, and Piotr Dollár. Microsoft COCO: Common objects in context, 2015. URL https://arxiv.org/abs/1405.0312.

  42. [51] Laurens Bliek, Arthur Guijt, Rickard Karlsson, Sicco Verwer, and Mathijs de Weerdt. Benchmarking surrogate-based optimisation algorithms on expensive black-box functions. Applied Soft Computing, 147:110744, 2023. ISSN 1568-4946. doi:10.1016/j.asoc.2023.110744.

  43. [52] Carl Hvarfner, Erik Orm Hellsten, and Luigi Nardi. Vanilla Bayesian optimization performs great in high dimensions. In Ruslan Salakhutdinov, Zico Kolter, Katherine Heller, Adrian Weller, Nuria Oliver, Jonathan Scarlett, and Felix Berkenkamp, editors, Proceedings of the 41st International Conference on Machine Learning, volume 235 of Proceedings of Machine L…

  44. [53] Binxin Ru, Ahsan Alvi, Vu Nguyen, Michael A. Osborne, and Stephen Roberts. Bayesian optimisation over multiple continuous and categorical inputs. In Hal Daumé III and Aarti Singh, editors, Proceedings of the 37th International Conference on Machine Learning, volume 119 of Proceedings of Machine Learning Research, pages 8276–8285. PMLR, 13–18 Jul 2020. URL …

  45. [54] John Cook. Determining distribution parameters from quantiles. UT MD Anderson Cancer Center Department of Biostatistics Working Paper Series, 01 2010.