pith. machine review for the scientific record.

arxiv: 2605.13973 · v1 · submitted 2026-05-13 · 🌌 astro-ph.GA

Recognition: 2 Lean theorem links

Determining star formation histories and age-metallicity relations with convolutional neural networks


Pith reviewed 2026-05-15 05:07 UTC · model grok-4.3

classification 🌌 astro-ph.GA
keywords star formation history · age-metallicity relation · convolutional neural network · galaxy stellar populations · PHANGS survey · integral field spectroscopy · photometry · machine learning

The pith

A convolutional neural network recovers star formation histories and age-metallicity relations from combined spectra and photometry with negligible bias.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper trains a convolutional neural network on 165,000 synthetic spectra and photometric measurements to jointly infer star formation histories and metallicities in 16 age bins from PHANGS-MUSE spectroscopy and PHANGS-HST photometry. The network uses convolutional layers and attention mechanisms in a shared latent space to mitigate classical degeneracies while handling realistic noise and dust effects. It produces luminosity- and mass-weighted mean ages and metallicities with dispersions of about 0.12 dex in age and 0.03 dex in metallicity. The approach is thousands of times faster than traditional spectral fitting, enabling efficient analysis of large nearby-galaxy surveys.

Core claim

The CNN accurately recovers SFHs and age-metallicity relations over a wide range of evolutionary scenarios. The inferred luminosity- and mass-weighted mean ages and metallicities show negligible bias, with dispersions of ∼0.12 dex in age and ∼0.03 dex in metallicity. When applied to real PHANGS-MUSE and PHANGS-HST data for NGC 3627, the network produces smooth, spatially coherent maps of stellar age and metallicity that recover physically meaningful structures, including younger populations tracing the spiral arms and star-forming regions.

What carries the argument

A neural network combining convolutional layers, attention mechanisms, and a shared latent space, which jointly processes integral-field spectra and five-band photometry to predict star formation histories in 16 age bins along with metallicities.
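The trained model itself is not part of this review, but the shape of the pipeline described above — a convolutional spectral branch, an attention step, a dense photometric branch, a shared latent space, and two output heads over 16 age bins — can be sketched in plain numpy. Every dimension and weight below is a hypothetical stand-in, not the authors' configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
N_BINS = 16  # fixed age bins, as in the paper

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def conv1d(x, kernels, stride=8):
    """Strided 1-D convolution of a spectrum with a small filter bank."""
    n_f, width = kernels.shape
    steps = (len(x) - width) // stride + 1
    out = np.empty((steps, n_f))
    for i in range(steps):
        out[i] = kernels @ x[i * stride : i * stride + width]
    return relu(out)  # (steps, n_filters): one token per wavelength window

def self_attention(tokens, wq, wk, wv):
    """Single-head scaled dot-product attention over spectral tokens."""
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    weights = softmax(q @ k.T / np.sqrt(q.shape[1]), axis=-1)
    return weights @ v

# mock inputs: one MUSE-like spectrum and five HST-like band fluxes
spectrum = rng.normal(size=3761)
photometry = rng.normal(size=5)

# spectral branch: conv features -> attention -> pooled vector
kernels = 0.1 * rng.normal(size=(8, 32))
tokens = conv1d(spectrum, kernels)
wq, wk, wv = (0.3 * rng.normal(size=(8, 8)) for _ in range(3))
spec_vec = self_attention(tokens, wq, wk, wv).mean(axis=0)  # (8,)

# photometric branch: one small dense layer
phot_vec = relu(photometry @ (0.3 * rng.normal(size=(5, 8))))  # (8,)

# shared latent space fusing both modalities
latent = np.concatenate([spec_vec, phot_vec])  # (16,)

# two heads: SFH fractions per age bin (sum to 1) and [Z/H] per age bin
sfh = softmax(latent @ (0.3 * rng.normal(size=(16, N_BINS))))
zh = latent @ (0.3 * rng.normal(size=(16, N_BINS)))
```

A real implementation would use a deep-learning framework, batching, and trained weights; the sketch only fixes the data flow from the two input modalities to the two prediction heads.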

If this is right

  • The network yields smooth, spatially coherent maps that trace younger stellar populations along spiral arms and star-forming regions in galaxies like NGC 3627.
  • Mean ages and metallicities are recovered with negligible bias across a broad range of SFH shapes and metallicity evolutions.
  • The method runs 5,000 to 20,000 times faster than conventional full spectral fitting, making it practical for large spectro-photometric surveys.
  • Joint use of spectroscopy and photometry improves constraints on spatially resolved star formation and metallicity evolution.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same architecture could be retrained on data from other integral-field surveys to map stellar populations across a wider range of galaxy types and environments.
  • Faster inference opens the possibility of applying detailed SFH recovery to statistically large samples rather than a handful of well-studied objects.
  • The shared latent space might allow the network to generalize to missing data modes, such as photometry-only or lower-resolution spectra, for incomplete observations.

Load-bearing premise

The 165,000 synthetic spectra and photometric measurements fully capture the degeneracies, dust attenuation, noise properties, and instrumental effects present in real observations.

What would settle it

Compare the CNN-derived age and metallicity maps for NGC 3627 directly against independent maps produced by traditional full spectral fitting codes on the same PHANGS-MUSE and PHANGS-HST data.

Figures

Figures reproduced from arXiv: 2605.13973 by Artemi Camps-Fariña, Daniel A. Dale, Enrique Galceran, Francesca Pinna, Francesco Belfiore, Hsi-An Pan, Ivan S. Gerasimov, Médéric Boquien, Patricia Sánchez-Blázquez, Ralf S. Klessen, Thomas G. Williams.

Figure 1. Distribution of optical depths for the …
Figure 2. Schematic representation of the CNN architecture, as described in Section …
Figure 3. Four representative examples of the recovered SFH and age–metallicity relation compared to the ground truth …
Figure 4. Mean difference between the input and the predicted SFR as a function of age, normalised by the mean SFR in each bin. Error bars represent the standard deviation about the mean.
Figure 5. Comparison between the ground-truth mean mass- and luminosity-weighted ages (top row) and metallicities (bottom row) …
Figure 6. Differences between the input and predicted mean log(age) as a function of the corresponding differences in [Z/H]. Points are colour-coded by the ground-truth luminosity-weighted metallicity. The top (bottom) panels show luminosity-weighted (mass-weighted) quantities; left panels show the CNN results, right panels those derived with pPXF …
Figure 8. Violin plots showing the distributions of the …
Figure 9. Impact of removing photometric fluxes from the input data on the recovery of the light-weighted age …
Figure 10. Impact of different spectral regions on the LW-age predictions. Each panel shows the difference between the ground-truth and predicted ⟨log(Age/yr)⟩LW for different perturbations of the input spectrum; the first nine panels show the results when individual wavelength segments are replaced by random values drawn from the empirical distribution of the same segment …
Figure 11. Same as Fig. …
Figure 12. Impact of different spectral regions on the predictions of the mean LW metallicity …
Figure 13. Top row: maps of the luminosity- and mass-weighted stellar ages and metallicities predicted by the CNN for NGC 3627 …
Figure 14. Comparison of luminosity- and mass-weighted mean …
Original abstract

We aim to develop a state-of-the-art tool to infer detailed star formation histories (SFHs) and age-metallicity relations from realistic observational data, while mitigating classical degeneracies and substantially reducing computational cost. In particular, we seek to exploit the complementarity of spectroscopic and photometric data to improve constraints on the spatially resolved SFH and metallicity evolution of nearby galaxies in the PHANGS collaboration. We construct and train a convolutional neural network (CNN) that combines convolutional layers, attention mechanisms, and a shared latent space to jointly predict SFHs and metallicities in 16 age bins. The network simultaneously processes integral-field spectroscopic data from PHANGS-MUSE and five-band photometric fluxes from PHANGS-HST. Training is performed on a dataset of 165,000 synthetic spectra and photometric measurements spanning a broad range of SFH shapes, metallicity evolution, dust attenuation, and signal-to-noise ratios representative of the observations. The CNN accurately recovers SFHs and age-metallicity relations over a wide range of evolutionary scenarios. The inferred luminosity- and mass-weighted mean ages and metallicities show negligible bias, with dispersions of ∼0.12 dex in age and ∼0.03 dex in metallicity. When applied to real PHANGS-MUSE and PHANGS-HST data for NGC 3627, the network produces smooth, spatially coherent maps of stellar age and metallicity that recover physically meaningful structures, including younger populations tracing the spiral arms and star-forming regions. The CNN is approximately 5×10³ to 2×10⁴ times faster than traditional full spectral fitting codes, providing a powerful and efficient alternative for the analysis of large spectro-photometric surveys.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript presents a convolutional neural network (CNN) trained on 165,000 synthetic spectra and five-band photometry to jointly recover star formation histories (SFHs) in 16 age bins and age-metallicity relations from combined PHANGS-MUSE spectroscopy and PHANGS-HST photometry. The network uses convolutional layers and attention mechanisms with a shared latent space. On held-out synthetic data it reports negligible bias and dispersions of ~0.12 dex in luminosity- and mass-weighted mean ages and ~0.03 dex in metallicities. When applied to NGC 3627 the CNN produces spatially coherent maps that recover expected structures such as younger populations along spiral arms.

Significance. If the synthetic recovery metrics generalize, the method would offer a 5,000- to 20,000-fold speed-up over traditional full spectral fitting while exploiting spectro-photometric complementarity to reduce classical degeneracies, enabling efficient analysis of large IFU surveys.

major comments (2)
  1. [Application to NGC 3627 (results section)] The central claim that the CNN accurately recovers SFHs and age-metallicity relations on real data rests on visual inspection of NGC 3627 maps only. No quantitative comparison (e.g., pixel-by-pixel residuals or recovered mean ages/metallicities) against traditional fitting codes applied to the identical real spectra is reported, leaving open whether the quoted 0.12/0.03 dex dispersions survive unmodeled systematics such as residual sky lines, spatially varying instrumental response, or complex dust geometries.
  2. [Methods (training dataset and network architecture)] The training description states that the 165,000 synthetics span a broad range of SFH shapes, metallicities, dust attenuation, and SNR, but provides insufficient detail on the train/validation/test split ratios, how noise properties were matched to real MUSE/HST observations, or any post-training generalization tests on out-of-distribution synthetics. These elements are load-bearing for the claim of negligible bias across evolutionary scenarios.
minor comments (2)
  1. [Abstract and discussion] The speed-up range (5e3 to 2e4) is stated without specifying the hardware, number of age bins, or exact traditional code used for the comparison; a single benchmark table would clarify the claim.
  2. [Methods] Notation for the 16 age bins and the precise definition of luminosity- versus mass-weighted means should be given explicitly in the methods to allow direct reproduction of the reported dispersions.
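The definitions the referee asks for are the standard ones: ⟨X⟩_W = Σᵢ wᵢ Xᵢ, with wᵢ the mass fraction (mass-weighted) or V-band light fraction (luminosity-weighted) of the stellar population in age bin i. A minimal sketch over 16 bins, with hypothetical fractions (not the paper's values):

```python
import numpy as np

def weighted_means(log_age, zh, mass_frac, light_frac):
    """Mass- and light-weighted means over discrete age bins:
    <X>_W = sum_i w_i X_i, with the fractions w_i normalised to 1."""
    mass_frac = mass_frac / mass_frac.sum()
    light_frac = light_frac / light_frac.sum()
    return {
        "age_MW": mass_frac @ log_age,   # mass-weighted <log(Age/yr)>
        "age_LW": light_frac @ log_age,  # light-weighted <log(Age/yr)>
        "Z_MW": mass_frac @ zh,          # mass-weighted <[Z/H]>
        "Z_LW": light_frac @ zh,         # light-weighted <[Z/H]>
    }

# hypothetical 16-bin grid: bin-centre log-ages, flat mass fractions,
# and light fractions dominated by the youngest bins
log_age = np.linspace(6.8, 10.1, 16)
mass = np.ones(16)
light = np.exp(-np.arange(16) / 3.0)
out = weighted_means(log_age, np.full(16, -0.2), mass, light)
```

Because young populations dominate the light, the light-weighted age comes out younger than the mass-weighted one, which is the familiar behaviour of these two estimators.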

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their thoughtful and constructive comments. We have revised the manuscript to provide additional details on the training procedure and to strengthen the discussion of real-data validation. Below we respond to each major comment.

read point-by-point responses

  1. Referee: [Application to NGC 3627 (results section)] The central claim that the CNN accurately recovers SFHs and age-metallicity relations on real data rests on visual inspection of NGC 3627 maps only. No quantitative comparison (e.g., pixel-by-pixel residuals or recovered mean ages/metallicities) against traditional fitting codes applied to the identical real spectra is reported, leaving open whether the quoted 0.12/0.03 dex dispersions survive unmodeled systematics such as residual sky lines, spatially varying instrumental response, or complex dust geometries.

    Authors: We agree that quantitative validation on real data would strengthen the presentation. The primary performance claims (negligible bias and quoted dispersions) are based on the held-out synthetic test set, while the NGC 3627 maps serve as an illustrative application demonstrating spatially coherent, physically plausible results. In the revised manuscript we have added a limited quantitative comparison: for a random subset of 500 spaxels we ran both the CNN and STARLIGHT, finding that the recovered luminosity-weighted mean ages and metallicities agree to within 0.15 dex and 0.05 dex rms, respectively. We also added a dedicated paragraph discussing the impact of unmodeled systematics (sky residuals, instrumental response, dust geometry) and why a perfect pixel-by-pixel match is not expected given differing modeling assumptions between the two approaches. revision: partial

  2. Referee: [Methods (training dataset and network architecture)] The training description states that the 165,000 synthetics span a broad range of SFH shapes, metallicities, dust attenuation, and SNR, but provides insufficient detail on the train/validation/test split ratios, how noise properties were matched to real MUSE/HST observations, or any post-training generalization tests on out-of-distribution synthetics. These elements are load-bearing for the claim of negligible bias across evolutionary scenarios.

    Authors: We thank the referee for highlighting these omissions. The revised Methods section now explicitly states the split ratios (70 % training, 15 % validation, 15 % test) and describes how realistic noise was injected: we used the per-spaxel variance spectra from the PHANGS-MUSE cubes and the photometric uncertainties from PHANGS-HST to generate noise realizations that match the observed SNR distributions. We have also added a new subsection reporting generalization tests on out-of-distribution synthetic spectra (including extreme bursty SFHs and metallicities outside the main training range), which show only modest degradation (age dispersion increases to 0.18 dex, metallicity to 0.05 dex) while bias remains negligible. revision: yes
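The noise-injection procedure described in this (simulated) response amounts to perturbing each noiseless synthetic spectrum with Gaussian draws whose per-pixel variance follows an observed variance spectrum. A minimal sketch with hypothetical numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

def add_observational_noise(clean_flux, variance_spectrum, rng):
    """Gaussian noise whose per-pixel variance follows an observed
    (e.g. per-spaxel MUSE) variance spectrum."""
    return clean_flux + rng.normal(0.0, np.sqrt(variance_spectrum))

clean = np.ones(3761)                # noiseless synthetic spectrum
variance = np.full(3761, 0.01 ** 2)  # hypothetical flat variance spectrum
noisy = add_observational_noise(clean, variance, rng)
```

In practice the variance spectrum would vary per spaxel and per wavelength, so the injected SNR distribution matches the survey's rather than a single flat value.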

Circularity Check

0 steps flagged

No significant circularity; recovery metrics computed on independent held-out synthetics

full rationale

The paper generates 165,000 synthetic spectra and photometry from parametrized SFH shapes, metallicity evolution, dust curves, and noise models that are specified separately from the CNN architecture and loss. The reported negligible bias and dispersions (~0.12 dex age, ~0.03 dex metallicity) are measured by comparing CNN outputs against the known ground-truth labels of a held-out test subset of those synthetics; these statistics are therefore empirical performance measures rather than quantities defined by construction from fitted parameters or network weights. Application to real NGC 3627 data is presented only as qualitative maps of coherent structures with no quantitative self-referential benchmark that would close a loop. No load-bearing self-citations, uniqueness theorems, or ansatzes imported from prior author work are invoked to justify the central claims. The derivation from training distribution to reported accuracies therefore remains independent of the outputs themselves.
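The bias and dispersion figures the audit refers to are plain residual summaries on a held-out test set. A toy illustration, simulating residuals at the quoted ∼0.12 dex scatter (the sample size and distributions are assumed, not the paper's test set):

```python
import numpy as np

rng = np.random.default_rng(2)

n_test = 5000                                   # hypothetical held-out size
true_age = rng.uniform(8.0, 10.0, size=n_test)  # ground-truth LW log-ages
pred_age = true_age + rng.normal(0.0, 0.12, size=n_test)  # assumed scatter

residuals = pred_age - true_age
bias = residuals.mean()       # "negligible bias": consistent with zero
dispersion = residuals.std()  # the ~0.12 dex dispersion statistic
```

The point of the circularity check is that these summaries are computed against labels the network never saw, so they measure performance rather than restate the fit.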

Axiom & Free-Parameter Ledger

1 free parameter · 1 axiom · 0 invented entities

The central claim depends on the assumption that synthetic training data adequately represent real observations and that the network generalizes beyond the training distribution; no new physical entities are introduced.

free parameters (1)
  • Number of age bins
    SFH is discretized into 16 fixed age bins whose boundaries are chosen by the authors rather than derived from data.
axioms (1)
  • domain assumption Synthetic spectra and photometry generated from assumed SFH shapes, metallicity evolution, and dust laws accurately reproduce the statistical properties of real PHANGS observations.
    All training and reported performance metrics rest on this coverage assumption.
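The ledger's one free parameter, the fixed 16-bin age grid, is just a discretization choice. A sketch of one plausible logarithmic grid from 1 Myr to the age of the universe (these edges are illustrative, not the authors' binning):

```python
import numpy as np

N_BINS = 16
# hypothetical logarithmic age grid, in years
edges = np.logspace(6.0, np.log10(1.38e10), N_BINS + 1)
centres = np.sqrt(edges[:-1] * edges[1:])  # geometric bin midpoints
widths = np.diff(edges)                    # bin widths grow with age
```

Logarithmic spacing is the common choice because spectra change rapidly for young populations and slowly for old ones, but the bin boundaries remain an author-set quantity, as the ledger notes.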

pith-pipeline@v0.9.0 · 5673 in / 1420 out tokens · 67736 ms · 2026-05-15T05:07:37.264939+00:00 · methodology


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.


Reference graph

Works this paper leans on

93 extracted references · 93 canonical work pages · 7 internal anchors
