pith. machine review for the scientific record.

arxiv: 2604.15855 · v1 · submitted 2026-04-17 · 🌌 astro-ph.SR

Recognition: unknown

PISP: Projected-Space Inference of Stellar Parameters

Authors on Pith: no claims yet

Pith reviewed 2026-05-10 08:11 UTC · model grok-4.3

classification 🌌 astro-ph.SR
keywords stellar parameters · spectral inference · PCA · active subspace · APOGEE · neural network emulator · chemical abundances

The pith

Projecting spectra into an orthonormal basis before optimization improves the accuracy of stellar temperature and abundance estimates.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper proposes PISP, a framework that projects high-dimensional stellar spectra onto a lower-dimensional orthonormal basis built with principal component analysis or active subspaces. Parameter optimization then occurs in this projected space, using either a fixed dimensionality or L1 regularization to select directions, which reduces the influence of correlations among parameters such as effective temperature and elemental abundances. The method is implemented in two versions for different scales and tested with neural-network spectral emulators on both synthetic Kurucz spectra and hundreds of thousands of APOGEE observations. Results show lower error scatter than direct baseline inference for multiple parameters, plus computational speedups in some setups. If correct, this approach could make large-scale spectroscopic surveys yield more precise stellar property catalogs.

Core claim

PISP constructs an orthonormal basis via PCA or the active-subspace method and optimizes the projection coefficients for stellar parameters either at a user-specified dimensionality or with L1 regularization for adaptive selection. Across four strategy combinations and two inference implementations, this projected-space approach reduces the standard deviation of differences between inferred and reference values for multiple parameters on both synthetic and real data, outperforming direct optimization in the original spectral space.

What carries the argument

Orthonormal basis from PCA or active subspaces, with fixed-dimensionality (Non-L1) or L1-regularized coefficient optimization in the projected space.
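The mechanics of the Non-L1 strategy can be sketched with a toy setup. Everything here is illustrative, not the paper's implementation: the "emulator" is a fixed linear map standing in for the paper's FNN/ResNet emulators, and the dimensions, basis size, and label distribution are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for the paper's neural-network spectral emulator:
# a fixed linear map from 5 stellar labels to a 300-pixel spectrum.
n_labels, n_pix = 5, 300
A = rng.normal(size=(n_pix, n_labels))

# Step 1: orthonormal basis from PCA on correlated training labels
# (PISP alternatively builds the basis with the active-subspace method).
mix = rng.normal(size=(n_labels, n_labels))
train = rng.normal(size=(1000, n_labels)) @ mix   # correlated labels
train -= train.mean(axis=0)
_, _, Vt = np.linalg.svd(train, full_matrices=False)
n = 3                        # user-specified projected dimensionality (Non-L1)
V = Vt[:n].T                 # columns = top-n orthonormal directions

# Step 2: optimize the n projection coefficients w rather than the raw
# labels theta; with a linear emulator the fit is closed-form.
theta_true = V @ np.array([1.0, -0.5, 0.25])        # lies in the subspace
y = A @ theta_true + 1e-8 * rng.normal(size=n_pix)  # "observed" spectrum
w_hat, *_ = np.linalg.lstsq(A @ V, y, rcond=None)   # projected-space fit
theta_hat = V @ w_hat        # map coefficients back to label space
```

Because the search runs over n orthogonal directions instead of n_labels correlated ones, correlated labels no longer trade off against each other during the fit.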

If this is right

  • PCA-L1 reduces error scatter by at least 0.01 dex for 12 of 20 abundances in synthetic data, with larger gains for several elements.
  • PCA-Non-L1 lowers effective temperature error by more than 30 K and improves 9 of 17 abundances in observed APOGEE spectra.
  • Some PISP configurations deliver roughly 4 times faster inference while maintaining or improving accuracy.
  • The accuracy gains appear across both fully connected and residual neural network emulators.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same projection step could be tested on spectra from other instruments to check if similar error reductions occur outside APOGEE.
  • If the basis selection proves robust, the method might shorten the time needed to build parameter catalogs for next-generation surveys.
  • One could explore whether active-subspace bases outperform PCA when the underlying parameter sensitivities vary strongly with wavelength.

Load-bearing premise

The chosen orthonormal projection captures all information needed for parameter recovery without introducing new systematic biases from the dimensionality reduction itself.
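A minimal numpy illustration of how this premise can fail: if the true labels have any component outside the retained subspace, the projected-space fit inherits an irreducible offset that no amount of signal-to-noise can remove. The linear "emulator", dimensions, and seed are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(7)
n_labels, n_pix, n = 5, 300, 3
A = rng.normal(size=(n_pix, n_labels))      # toy linear "emulator"

Q, _ = np.linalg.qr(rng.normal(size=(n_labels, n_labels)))
V = Q[:, :n]                                # retained orthonormal basis

theta_true = rng.normal(size=n_labels)      # generic labels: not in span(V)
y = A @ theta_true                          # noiseless "observed" spectrum

w_hat, *_ = np.linalg.lstsq(A @ V, y, rcond=None)
theta_hat = V @ w_hat                       # best answer expressible in the basis

# theta_hat can never leave span(V), so the discarded component of
# theta_true becomes a floor on the recovery error, noise or not.
bias = np.linalg.norm(theta_hat - theta_true)
```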

What would settle it

Applying both the baseline and all PISP variants to a new, independent set of spectra and finding no reduction in error scatter for any parameter or strategy would show the projection does not improve inference.

Figures

Figures reproduced from arXiv: 2604.15855 by A-Li Luo, Hai-Ling Lu, Hao Zeng, Hugh R. A. Jones, Jun-Chao Liang, Ke-Fei Wu, Li-Li Wang, Ming-Hui Jia, Shu-Guo Ma, Shuo Li, Shuo Ye, Xiao Kong, Xiao-Xiao Ma, Yin-Bi Li, Zhi-Hua Zhong.

Figure 1
Distribution of spectral flux-reconstruction errors (RMSE) across different Teff ranges. The left and right panels show the RMSE histograms of the FNN- and ResNet-based spectral emulators on the test set, respectively.
Figure 2
Hyperparameter settings (n and α) of PISP across different datasets. The first column lists the hyperparameters tuned on the Kurucz validation set, while the second column shows those optimized on the APOGEE DR17 reference set. The x-axis indicates the stellar parameters and the y-axis the standard deviation (σ) of the differences (∆) between inferred and reference values.
Figure 3
Hyperparameter settings (lr and ϵ) of PISP-Adam on the Kurucz validation set. The three panels correspond to optimization in the baseline strategy (Raw), PCA-Non-L1 (n=25), and PCA-L1 (α=7×10−4), respectively. Each curve represents a fixed convergence threshold ϵ ranging from 10−8 to 10−4; the x-axis shows the learning rate lr on a base-10 logarithmic scale.
Figure 4
Comparison of stellar parameters inferred by PISP-Adam and the baseline against Kurucz reference labels on the test set, based on the FNN spectral emulator and the Adam optimizer. Each panel shows inferred versus reference labels: black squares correspond to the baseline strategy, red circles to the PCA-L1 strategy (α=7×10−4).
Figure 5
Standard deviations of differences relative to the Kurucz reference labels for PISP and baseline inferences on the Kurucz test set using the FNN spectral emulator. The 12 panels are organized into six Teff–log g subsets (rows), with the optimizer changing by column (left: CurveFit; right: Adam). In each panel, σ is computed from the differences (∆=inferred−reference) for 25 labels using sigma clipping (astropy.stats).
Figure 6
Same as Figure 5, but using the ResNet spectral emulator.
Figure 7
Fractional statistics of principal-component coefficients (w) with absolute values below 10−3, derived using PISP on the Kurucz test set. The first and second columns correspond to the PCA-Non-L1 and PCA-L1 strategies, respectively. The x-axis indexes the principal component and the y-axis shows the fraction of test samples with |w|<10−3.
Figure 8
Standard deviations of differences relative to the APOGEE official catalog for PISP and baseline inferences on APOGEE DR17 spectra using the FNN spectral emulator. The 12 panels are organized into six Teff–log g subsets (rows), with the optimizer changing by column (left: CurveFit; right: Adam). In each panel, σ is computed from the differences (∆=inferred−reference) for 21 labels using sigma clipping (astropy.stats).
Figure 9
Same as Figure 8, but using the ResNet spectral emulator.
Figure 10
Same as Figure 7, but for the APOGEE spectra.
Figure 11
Comparison between model errors (solid curves) and repeat-observation errors (dashed curves) as a function of S/N for the baseline strategy. All results are based on the ResNet spectral emulator and PISP-CurveFit. Errors are computed in S/N bins of width 5 and reported only for bins containing at least three stars, with the median error shown on the vertical axis; for 12C/13C, the y-axis is base-10 logarithmic.
Figure 12
Same as Figure 11, but for PISP.
Figure 13
Standard deviations of differences relative to the Kurucz reference labels for PISP and baseline inferences on the test set obtained with different optimizers, using the FNN spectral emulator. Each panel corresponds to a different optimizer: BFGS, L-BFGS, L-BFGS-B, trf, lm, dogbox, Adam with ReduceLROnPlateau, and Adam.
Figure 14
Comparison of iteration counts between PISP-Adam and the baseline strategy for the inference of 25-dimensional stellar parameters from 722,896 APOGEE DR17 spectra. The left and right panels correspond to results using the FNN and ResNet spectral emulators, respectively. The left panel includes 73 groups (72 with N=10,000 spectra and one with N=2,896), while the right panel includes 145 groups (144 with N=5,000 …).
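Several captions note that σ is computed with sigma clipping from astropy.stats. A pure-numpy stand-in shows the idea; the helper name `clipped_mean_sigma`, the clip level, and the synthetic data are all invented here, not taken from the paper.

```python
import numpy as np

def clipped_mean_sigma(delta, nsigma=3.0, maxiters=5):
    """Iterative sigma clipping (a numpy stand-in for astropy.stats),
    returning mean(delta) and sigma(delta) over the surviving points."""
    d = np.asarray(delta, dtype=float)
    mask = np.ones(d.shape, dtype=bool)
    for _ in range(maxiters):
        mu, sd = d[mask].mean(), d[mask].std()
        new_mask = np.abs(d - mu) <= nsigma * sd
        if new_mask.sum() == mask.sum():
            break                      # clipping has converged
        mask = new_mask
    return d[mask].mean(), d[mask].std()

# Synthetic inferred-minus-reference differences: tight scatter plus a
# handful of catastrophic outliers that would otherwise inflate sigma.
rng = np.random.default_rng(1)
delta = rng.normal(0.0, 0.05, size=2000)
delta[:5] = 5.0
mean_d, sigma_d = clipped_mean_sigma(delta)
```

Without clipping, the five outliers dominate the raw standard deviation; with it, σ(∆) reflects the bulk scatter the paper's comparisons depend on.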
read the original abstract

To improve the accuracy and efficiency of high-dimensional stellar parameter inference in large spectroscopic datasets, we propose a projection-assisted parameter-inference framework -- Projected-Space Inference of Stellar Parameters (PISP). PISP constructs an orthonormal basis and optimizes in the projected space, reducing the impact of parameter correlations on inference. The basis is constructed using either principal component analysis (PCA) or the active-subspace (AS) method and is combined with two inference strategies -- Non-L1, which optimizes the projection coefficients for a user-specified projected dimensionality, and L1, which introduces L1 regularization in the full projected space to adaptively select projection directions -- yielding four strategies: PCA-Non-L1, AS-Non-L1, PCA-L1, and AS-L1. For different computational scenarios, we implement two versions: PISP-CurveFit for fast single-spectrum inference and PISP-Adam for large-scale GPU-parallel inference. Using a fully connected neural network and a residual network as spectral emulators, we evaluate PISP on Kurucz synthetic spectra and on $722{,}896$ APOGEE DR$17$ observed spectra. Compared to the baseline strategy, PISP improves inference accuracy for multiple parameters across all emulator-optimizer combinations. In synthetic data, PCA-L1 performs best, reducing the standard deviation of differences ($\sigma(\Delta)$) by at least $0.01$ dex for $12$ of $20$ elemental abundances, with [N/H], [O/H], [Na/H], [Co/H], [P/H], [V/H], [Cu/H] showing $0.05$--$0.72$ dex reductions. In observed data, PCA-Non-L1 reduces $\sigma(\Delta)$ by $>30$ K for effective temperature and by at least $0.01$ dex for $9$ of $17$ elemental abundances, with [O/H], [Na/H], [V/H] showing $0.05$--$0.20$ dex reductions, while achieving a $\sim$$4\times$ efficiency gain, slightly outperforming PCA-L1.
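The abstract's L1 strategy amounts to a lasso problem in the projected coefficients. A generic sketch using ISTA (iterative soft-thresholding); the helper names, the operator M, the dimensions, and α are placeholders, not the paper's settings or optimizer.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(M, y, alpha, n_iter=500):
    """Minimize 0.5*||y - M@w||^2 + alpha*||w||_1 by iterative
    soft-thresholding. Inactive coefficients are driven to exactly zero,
    which is how an L1 penalty 'adaptively selects' projection directions."""
    L = np.linalg.norm(M, 2) ** 2        # Lipschitz constant of the gradient
    w = np.zeros(M.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(w + M.T @ (y - M @ w) / L, alpha / L)
    return w

rng = np.random.default_rng(0)
M = rng.normal(size=(120, 10))           # emulator composed with the basis
w_true = np.zeros(10)
w_true[[0, 3]] = [2.0, -1.5]             # only two directions carry signal
y = M @ w_true
w_hat = ista(M, y, alpha=0.1)
```

The recovered `w_hat` keeps the two informative directions near their true values while the remaining coefficients collapse to (numerically) zero, mirroring the small-|w| fractions reported in Figures 7 and 10.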

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

2 major / 1 minor

Summary. The paper introduces PISP, a projection-based framework for stellar parameter inference from spectra. It constructs an orthonormal basis via PCA or active subspaces, then applies either fixed-dimensionality (Non-L1) or L1-regularized optimization in the projected space. Four variants are tested with neural network emulators (fully connected and residual) on Kurucz synthetic spectra and 722,896 APOGEE DR17 spectra, claiming consistent reductions in σ(Δ) versus an unprojected baseline plus efficiency gains.

Significance. If the accuracy gains prove robust and free of projection-induced bias, PISP could meaningfully improve both precision and throughput for high-dimensional inference on large spectroscopic surveys. The reported ~4× efficiency improvement and gains on real APOGEE data are practically relevant.

major comments (2)
  1. [Abstract / Results] The central claim that PISP improves accuracy rests on reductions in σ(Δ). On observed APOGEE data, however, Δ is measured against an external catalog rather than ground truth; this metric cannot detect mean systematic offsets that the projection step might introduce (e.g., by discarding low-variance but parameter-informative directions). Synthetic-data results partially mitigate this, but the observed-data conclusions require additional bias diagnostics.
  2. [Abstract] No information is provided on baseline implementation details, number of optimization runs, statistical significance testing of the reported σ(Δ) differences, or cross-validation procedures. Without these, it is difficult to judge whether the claimed improvements (e.g., ≥0.01 dex for 12/20 abundances in synthetic data) are reliable or could arise from post-hoc selection.
minor comments (1)
  1. The four strategy names (PCA-Non-L1, AS-Non-L1, PCA-L1, AS-L1) are introduced in the abstract but would benefit from a concise table or diagram early in the methods section for quick reference.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the constructive and detailed review. The comments highlight important aspects of metric interpretation and experimental reproducibility. We address each major comment below and will revise the manuscript to incorporate the suggested improvements.

read point-by-point responses
  1. Referee: [Abstract / Results] The central claim that PISP improves accuracy rests on reductions in σ(Δ). On observed APOGEE data, however, Δ is measured against an external catalog rather than ground truth; this metric cannot detect mean systematic offsets that the projection step might introduce (e.g., by discarding low-variance but parameter-informative directions). Synthetic-data results partially mitigate this, but the observed-data conclusions require additional bias diagnostics.

    Authors: We agree that σ(Δ) captures scatter but not potential mean biases that could arise if the projection discards informative directions. In the synthetic experiments (where ground truth is known), the mean differences (Δ) for PISP variants remain small and comparable to or smaller than the baseline, with no evidence of introduced systematics. For the APOGEE results, we will add explicit reporting of mean(Δ) alongside σ(Δ) for all parameters in the revised manuscript. This will allow readers to verify the absence of projection-induced offsets and strengthen the observed-data claims. revision: yes

  2. Referee: [Abstract] No information is provided on baseline implementation details, number of optimization runs, statistical significance testing of the reported σ(Δ) differences, or cross-validation procedures. Without these, it is difficult to judge whether the claimed improvements (e.g., ≥0.01 dex for 12/20 abundances in synthetic data) are reliable or could arise from post-hoc selection.

    Authors: We acknowledge that these details are essential for assessing robustness. The revised manuscript will expand the Methods and Experiments sections to specify: (i) the exact baseline implementation (unprojected optimization with identical emulator and optimizer settings), (ii) the number of random initializations or optimization runs performed per spectrum, (iii) the statistical tests used to evaluate σ(Δ) differences (e.g., paired Wilcoxon or bootstrap confidence intervals), and (iv) the cross-validation procedure for training the neural-network emulators. These additions will demonstrate that the reported gains are not the result of post-hoc selection. revision: yes
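One form the proposed significance test (iii) could take: a paired bootstrap confidence interval on the σ(∆) reduction. The helper name `bootstrap_sigma_drop` and all numbers below are synthetic stand-ins, not results from the paper.

```python
import numpy as np

def bootstrap_sigma_drop(d_base, d_pisp, n_boot=2000, seed=0):
    """95% bootstrap CI for sigma(d_base) - sigma(d_pisp), resampling
    stars jointly so the baseline/PISP pairing is preserved. An interval
    entirely above zero supports a genuine scatter reduction."""
    rng = np.random.default_rng(seed)
    n = len(d_base)
    drops = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample the same stars for both
        drops[b] = d_base[idx].std() - d_pisp[idx].std()
    return np.percentile(drops, [2.5, 97.5])

rng = np.random.default_rng(5)
n_stars = 1500
d_base = rng.normal(0.0, 0.08, size=n_stars)   # baseline differences
d_pisp = rng.normal(0.0, 0.05, size=n_stars)   # projected-space differences
lo, hi = bootstrap_sigma_drop(d_base, d_pisp)
```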

Circularity Check

0 steps flagged

No significant circularity; empirical gains are measured against an independent baseline

full rationale

The paper defines PISP as a projection framework (PCA/AS basis + L1/Non-L1 optimization) and reports empirical accuracy gains on held-out synthetic spectra and APOGEE observations relative to an unprojected baseline. No equation or result is shown to equal its own fitted inputs by construction; the reported σ(Δ) reductions are external performance metrics, not tautological re-expressions of the projection step itself. The method description is self-contained and does not rely on self-citations for its central claims.

Axiom & Free-Parameter Ledger

1 free parameter · 1 axiom · 0 invented entities

The framework rests on the assumption that linear dimensionality reduction suffices to decorrelate stellar parameters and that the resulting low-dimensional optimization yields unbiased recoveries; no new physical entities are postulated.

free parameters (1)
  • projected dimensionality
    User-specified in Non-L1 or implicitly selected by L1 regularization strength; directly controls the size of the optimization space and therefore the reported accuracy gains.
axioms (1)
  • domain assumption Variations in stellar spectra lie approximately in a low-dimensional linear subspace that can be recovered by PCA or active-subspace methods.
    Invoked when the orthonormal basis is constructed to reduce parameter correlations.
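The axiom is checkable: under it, a PCA of the spectra themselves should concentrate nearly all variance in a few components. A synthetic check in which the axiom holds by construction (the template count, noise level, and sizes are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n_spec, n_pix, k_true = 500, 400, 4

# Spectra built as linear combinations of k_true templates plus a weak
# noise floor -- i.e., data satisfying the low-dimensional-subspace axiom.
templates = rng.normal(size=(k_true, n_pix))
spectra = rng.normal(size=(n_spec, k_true)) @ templates
spectra += 0.01 * rng.normal(size=spectra.shape)

X = spectra - spectra.mean(axis=0)
s = np.linalg.svd(X, compute_uv=False)
explained = np.cumsum(s**2) / np.sum(s**2)
# If the axiom held only approximately, explained[k] would plateau
# well below 1 and the projection would discard real signal.
```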

pith-pipeline@v0.9.0 · 5748 in / 1359 out tokens · 64101 ms · 2026-05-10T08:11:32.682713+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

71 extracted references · 65 canonical work pages · 2 internal anchors
