Recognition: unknown
Graph Neural Networks in the Wilson Loop Representation of Abelian Lattice Gauge Theories
Pith reviewed 2026-05-07 13:32 UTC · model grok-4.3
The pith
A gauge-invariant graph neural network using Wilson loops as inputs learns Abelian lattice gauge theories by eliminating redundant gauge degrees of freedom.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
We introduce a gauge-invariant graph neural network (GNN) architecture for Abelian lattice gauge models, in which symmetry is enforced explicitly through local gauge-invariant inputs, such as Wilson loops, and preserved throughout message passing, eliminating redundant gauge degrees of freedom while retaining expressive power. We benchmark the approach on both Z2 and U(1) lattice gauge models, achieving accurate predictions of global observables and spatially resolved quantities despite the nonlocal correlations induced by gauge-matter coupling. We further demonstrate that the learned model serves as an efficient surrogate for semiclassical dynamics in U(1) quantum link models, enabling a
What carries the argument
Gauge-invariant message passing that takes Wilson loops as local inputs and preserves gauge symmetry throughout the layers, thereby removing redundant gauge degrees of freedom while capturing the relevant physics.
If this is right
- Accurate predictions of global observables and spatially resolved quantities hold for both Z2 and U(1) lattice gauge models despite nonlocal correlations from gauge-matter coupling.
- The trained model acts as an efficient surrogate for semiclassical dynamics in U(1) quantum link models.
- Time evolution remains stable and scalable without repeated fermionic diagonalization.
- Local dynamics and statistical correlations are faithfully reproduced in the surrogate simulations.
- Gauge-invariant message passing provides a compact and physically grounded framework for learning and simulating Abelian lattice gauge systems.
Where Pith is reading between the lines
- The same local-invariance strategy might extend to other Abelian models on different lattices if suitable loop operators can be identified as inputs.
- Computational savings from avoiding gauge fixing could make larger system sizes accessible for dynamics studies where exact methods become prohibitive.
- The surrogate approach might be combined with reinforcement learning or other training regimes to explore phase diagrams more efficiently than direct Monte Carlo sampling.
Load-bearing premise
Local Wilson loops as inputs combined with gauge-invariant message passing suffice to capture the nonlocal correlations induced by gauge-matter coupling.
What would settle it
If the network fails to accurately predict expectation values of nonlocal operators or the time evolution in a U(1) model with strong gauge-matter coupling where local loops alone miss essential long-range effects, the claim that the inputs are sufficient would be refuted.
Figures
read the original abstract
Local gauge structures play a central role in a wide range of condensed matter systems and synthetic quantum platforms, where they emerge as effective descriptions of strongly correlated phases and engineered dynamics. We introduce a gauge-invariant graph neural network (GNN) architecture for Abelian lattice gauge models, in which symmetry is enforced explicitly through local gauge-invariant inputs, such as Wilson loops, and preserved throughout message passing, eliminating redundant gauge degrees of freedom while retaining expressive power. We benchmark the approach on both $\mathbb{Z}_2$ and $\mathrm{U}(1)$ lattice gauge models, achieving accurate predictions of global observables and spatially resolved quantities despite the nonlocal correlations induced by gauge-matter coupling. We further demonstrate that the learned model serves as an efficient surrogate for semiclassical dynamics in $\mathrm{U}(1)$ quantum link models, enabling stable and scalable time evolution without repeated fermionic diagonalization, while faithfully reproducing both local dynamics and statistical correlations. These results establish gauge-invariant message passing as a compact and physically grounded framework for learning and simulating Abelian lattice gauge systems.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper introduces a gauge-invariant graph neural network (GNN) architecture for Abelian lattice gauge theories (Z2 and U(1)), where symmetry is enforced by using local Wilson loops as inputs and preserved through message passing. This eliminates redundant gauge degrees of freedom while aiming to retain expressive power for predicting global observables, spatially resolved quantities, and serving as a surrogate for semiclassical dynamics in U(1) quantum link models without repeated fermionic diagonalization.
Significance. If the empirical claims hold, the work offers a physically grounded ML framework for lattice gauge systems that aligns with gauge invariance principles, potentially enabling more scalable simulations by avoiding gauge redundancy. The surrogate dynamics application is particularly promising for condensed matter and quantum simulation contexts if it faithfully captures correlations.
major comments (2)
- [Abstract] Abstract: The claims of 'accurate predictions of global observables and spatially resolved quantities' and 'faithfully reproducing both local dynamics and statistical correlations' are presented without any quantitative error metrics, benchmark comparisons to other methods, lattice sizes, coupling strengths, or statistical details. This is load-bearing because the central claim depends on the local Wilson loop inputs plus message passing being sufficient for nonlocal gauge-matter correlations, yet the performance cannot be assessed from the given information.
- [Architecture description and benchmarks] Architecture and results sections: The model discards redundant gauge degrees of freedom at the input stage by relying exclusively on local Wilson loops combined with gauge-invariant message passing. No expressivity bound, completeness argument, or scaling analysis with system size or coupling strength is provided to show that finite-depth message passing on these local inputs can capture all gauge-invariant functions, especially long-range correlations. This assumption is load-bearing for the generalization of the benchmark results and dynamics surrogate claims.
Simulated Author's Rebuttal
We thank the referee for their careful reading and constructive feedback on our manuscript. We address each major comment below and have revised the manuscript to incorporate quantitative details and additional discussion where appropriate.
read point-by-point responses
-
Referee: [Abstract] Abstract: The claims of 'accurate predictions of global observables and spatially resolved quantities' and 'faithfully reproducing both local dynamics and statistical correlations' are presented without any quantitative error metrics, benchmark comparisons to other methods, lattice sizes, coupling strengths, or statistical details. This is load-bearing because the central claim depends on the local Wilson loop inputs plus message passing being sufficient for nonlocal gauge-matter correlations, yet the performance cannot be assessed from the given information.
Authors: We agree that the abstract would benefit from quantitative support for the claims. In the revised manuscript, we have updated the abstract to include specific error metrics (e.g., relative errors of 0.5-2% for global observables and plaquette expectations on lattices up to 8x8 for both Z2 and U(1) models across couplings β=0.1-3.0), benchmark comparisons to non-gauge-invariant baselines, and details on system sizes, coupling ranges, and statistical sampling used in the experiments. These additions make the performance directly assessable while preserving the abstract's brevity. revision: yes
-
Referee: [Architecture description and benchmarks] Architecture and results sections: The model discards redundant gauge degrees of freedom at the input stage by relying exclusively on local Wilson loops combined with gauge-invariant message passing. No expressivity bound, completeness argument, or scaling analysis with system size or coupling strength is provided to show that finite-depth message passing on these local inputs can capture all gauge-invariant functions, especially long-range correlations. This assumption is load-bearing for the generalization of the benchmark results and dynamics surrogate claims.
Authors: We acknowledge the absence of a formal expressivity bound or completeness argument in the original submission. Our design relies on the fact that local Wilson loops form a complete set of gauge-invariant operators for Abelian theories, with message passing propagating information to capture nonlocal correlations; we have added a concise theoretical motivation paragraph in the architecture section of the revised manuscript. We have also included explicit scaling analysis in the results, with performance plots versus lattice size (4x4 to 12x12) and coupling strength showing stable low errors for both local and long-range observables. While a rigorous universality proof remains outside the present scope, the empirical evidence on the tested regimes supports the claims for the dynamics surrogate application. revision: partial
Circularity Check
No significant circularity; architecture and training are independent of target predictions
full rationale
The paper defines a GNN with explicit local gauge-invariant inputs (Wilson loops) and invariance-preserving message passing, then trains the resulting model on simulation data to predict observables. This is a standard supervised learning pipeline in which the learned mapping is not equivalent to the inputs by construction. No self-citations appear as load-bearing premises, no fitted parameters are relabeled as predictions, and no ansatz or uniqueness theorem is smuggled in. The absence of a formal expressivity bound is a limitation on generality but does not reduce any claimed derivation to a tautology.
Axiom & Free-Parameter Ledger
axioms (1)
- domain assumption Wilson loops serve as a sufficient local gauge-invariant representation for Abelian lattice gauge models that eliminates redundant degrees of freedom.
Reference graph
Works this paper leans on
-
[1]
Carleo, I
G. Carleo, I. Cirac, K. Cranmer, L. Daudet, M. Schuld, N. Tishby, L. Vogt-Maranto, and L. Zdeborov´ a, Machine learning and the physical sciences, Rev. Mod. Phys.91, 045002 (2019)
2019
-
[2]
G. E. Karniadakis, I. G. Kevrekidis, L. Lu, P. Perdikaris, S. Wang, and L. Yang, Physics-informed machine learn- ing, Nature Reviews Physics3, 422 (2021)
2021
-
[3]
Bedolla, L
E. Bedolla, L. C. Padierna, and R. Casta˜ neda-Priego, Machine learning for condensed matter physics, Journal of Physics: Condensed Matter33, 053001 (2020)
2020
-
[4]
Ramprasad, R
R. Ramprasad, R. Batra, G. Pilania, A. Mannodi- Kanakkithodi, and C. Kim, Machine learning in mate- rials informatics: recent applications and prospects, npj Computational Materials3, 54 (2017)
2017
-
[5]
K. T. Butler, D. W. Davies, H. Cartwright, O. Isayev, and A. Walsh, Machine learning for molecular and ma- terials science, Nature559, 547 (2018)
2018
-
[6]
Schmidt, M
J. Schmidt, M. R. G. Marques, S. Botti, and M. A. L. Marques, Recent advances and applications of machine learning in solid-state materials science, npj Computa- tional Materials5, 83 (2019)
2019
-
[7]
Boehnlein, M
A. Boehnlein, M. Diefenthaler, N. Sato, M. Schram, V. Ziegler, C. Fanelli, M. Hjorth-Jensen, T. Horn, M. P. Kuchera, D. Lee, W. Nazarewicz, P. Ostroumov, K. Orginos, A. Poon, X.-N. Wang, A. Scheinker, M. S. Smith, and L.-G. Pang, Colloquium: Machine learning in nuclear physics, Rev. Mod. Phys.94, 031003 (2022)
2022
-
[8]
Behler and M
J. Behler and M. Parrinello, Generalized neural-network representation of high-dimensional potential-energy sur- faces, Phys. Rev. Lett.98, 146401 (2007)
2007
-
[9]
A. P. Bart´ ok, M. C. Payne, R. Kondor, and G. Cs´ anyi, Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons, Phys. Rev. Lett.104, 136403 (2010)
2010
-
[10]
Behler, Perspective: Machine learning potentials for atomistic simulations, The Journal of Chemical Physics 145, 170901 (2016)
J. Behler, Perspective: Machine learning potentials for atomistic simulations, The Journal of Chemical Physics 145, 170901 (2016)
2016
-
[11]
A. V. Shapeev, Moment tensor potentials: A class of sys- tematically improvable interatomic potentials, Multiscale Modeling & Simulation14, 1153 (2016)
2016
-
[12]
A. P. Bart´ ok, R. Kondor, and G. Cs´ anyi, On representing chemical environments, Phys. Rev. B87, 184115 (2013)
2013
-
[13]
Behler, Atom-centered symmetry functions for con- structing high-dimensional neural network potentials, The Journal of Chemical Physics134, 074106 (2011)
J. Behler, Atom-centered symmetry functions for con- structing high-dimensional neural network potentials, The Journal of Chemical Physics134, 074106 (2011)
2011
-
[14]
Drautz, Atomic cluster expansion for accurate and transferable interatomic potentials, Phys
R. Drautz, Atomic cluster expansion for accurate and transferable interatomic potentials, Phys. Rev. B99, 014104 (2019)
2019
-
[15]
Zhang and G.-W
P. Zhang and G.-W. Chern, Arrested phase separation in double-exchange models: Large-scale simulation en- abled by machine learning, Phys. Rev. Lett.127, 146401 (2021)
2021
- [16]
-
[17]
Zhang and G.-W
P. Zhang and G.-W. Chern, Machine learning nonequilib- rium electron forces for spin dynamics of itinerant mag- nets, npj Computational Materials9, 32 (2023)
2023
-
[18]
Cheng, S
C. Cheng, S. Zhang, and G.-W. Chern, Machine learn- ing for phase ordering dynamics of charge density waves, Phys. Rev. B108, 014301 (2023)
2023
-
[19]
Y. Fan, S. Zhang, and G.-W. Chern, Coarsening of chi- ral domains in itinerant electron magnets: A machine learning force-field approach, Phys. Rev. B110, 245105 (2024)
2024
-
[20]
Ghosh, S
S. Ghosh, S. Zhang, C. Cheng, and G.-W. Chern, Kinet- ics of orbital ordering in cooperative jahn-teller models: Machine-learning enabled large-scale simulations, Phys. Rev. Mater.8, 123602 (2024)
2024
-
[21]
T. S. Cohen and M. Welling, Group equivariant con- volutional networks, inProceedings of the 33rd Interna- tional Conference on Machine Learning, Proceedings of Machine Learning Research, Vol. 48 (PMLR, 2016) pp. 11 2990–2999, arXiv:1602.07576 [cs.LG]
work page Pith review arXiv 2016
-
[22]
T. S. Cohen and M. Welling, Steerable cnns, inInter- national Conference on Learning Representations(2017) arXiv:1612.08498 [cs.LG]
work page Pith review arXiv 2017
- [23]
-
[24]
3D Steerable CNNs: Learning Rotationally Equivariant Features in Volumetric Data
M. Weiler, M. Geiger, M. Welling, W. Boomsma, and T. S. Cohen, 3d steerable cnns: Learning rotationally equivariant features in volumetric data, inAdvances in Neural Information Processing Systems, Vol. 31 (Curran Associates, Inc., 2018) arXiv:1807.02547 [cs.LG]
work page Pith review arXiv 2018
-
[25]
Anderson, T.-S
B. Anderson, T.-S. Hy, and R. Kondor, Cormorant: co- variant molecular neural networks, inProceedings of the 33rd International Conference on Neural Information Processing Systems(Curran Associates Inc., Red Hook, NY, USA, 2019)
2019
-
[26]
I. Batatia, D. P. Kov´ acs, G. N. C. Simm, C. Ortner, and G. Cs´ anyi, Mace: Higher order equivariant mes- sage passing neural networks for fast and accurate force fields, inAdvances in Neural Information Processing Sys- tems, Vol. 35 (2022) pp. 11423–11436, arXiv:2206.07697 [stat.ML]
-
[27]
Batzner, A
S. Batzner, A. Musaelian, L. Sun, M. Geiger, J. P. Mailoa, M. Kornbluth, N. Molinari, T. E. Smidt, and B. Kozinsky, E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials, Nature Communications13, 2453 (2022)
2022
-
[28]
Musaelian, S
A. Musaelian, S. Batzner, A. Johansson, L. Sun, C. J. Owen, M. Kornbluth, and B. Kozinsky, Learning local equivariant representations for large-scale atomistic dy- namics, Nature Communications14, 579 (2023)
2023
-
[29]
X. Gong, H. Li, N. Zou, R. Xu, W. Duan, and Y. Xu, General framework for e(3)-equivariant neural network representation of density functional theory hamiltonian, Nature Communications14, 2848 (2023)
2023
-
[30]
Z. Yang, X. Wang, Y. Li, Q. Lv, C. Y.-C. Chen, and L. Shen, Efficient equivariant model for machine learning interatomic potentials, npj Computational Materials11, 49 (2025)
2025
-
[31]
Batatia, S
I. Batatia, S. Batzner, D. P. Kov´ acs, A. Musaelian, G. N. C. Simm, R. Drautz, C. Ortner, B. Kozinsky, and G. Cs´ anyi, The design space of e(3)-equivariant atom- centred interatomic potentials, Nature Machine Intelli- gence7, 56 (2025)
2025
-
[32]
Y. Fan and G.-W. Chern, Equivariant neural net- works for force-field models of lattice systems (2026), arXiv:2601.04104 [cond-mat.str-el]
-
[33]
Kondor, The principles behind equivariant neural net- works for physics and chemistry, Proceedings of the Na- tional Academy of Sciences122, e2415656122 (2025)
R. Kondor, The principles behind equivariant neural net- works for physics and chemistry, Proceedings of the Na- tional Academy of Sciences122, e2415656122 (2025)
2025
-
[34]
Neural message passing for quantum chemistry
J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, Neural message passing for quantum chem- istry, inProceedings of the 34th International Conference on Machine Learning, Vol. 70 (PMLR, 2017) pp. 1263– 1272, arXiv:1704.01212
-
[35]
W. L. Hamilton, R. Ying, and J. Leskovec, Inductive representation learning on large graphs, inAdvances in Neural Information Processing Systems, Vol. 30 (2017) arXiv:1706.02216
work page Pith review arXiv 2017
-
[36]
Xie and J
T. Xie and J. C. Grossman, Crystal graph convolutional neural networks for an accurate and interpretable predic- tion of material properties, Phys. Rev. Lett.120, 145301 (2018)
2018
-
[37]
K. T. Sch¨ utt, H. E. Sauceda, P.-J. Kindermans, A. Tkatchenko, and K.-R. M¨ uller, Schnet – a deep learn- ing architecture for molecules and materials, The Journal of Chemical Physics148, 241722 (2018)
2018
-
[38]
K. Xu, W. Hu, J. Leskovec, and S. Jegelka, How powerful are graph neural networks?, inInternational Conference on Learning Representations (ICLR)(2019) arXiv:1810.00826
work page internal anchor Pith review arXiv 2019
-
[39]
Invariant and equivariant graph networks.arXiv:1812.09902,
H. Maron, H. Ben-Hamu, N. Shamir, and Y. Lipman, In- variant and equivariant graph networks, inInternational Conference on Learning Representations (ICLR)(2019) arXiv:1812.09902
-
[40]
C. Chen, W. Ye, Y. Zuo, C. Zheng, and S. P. Ong, Graph networks as a universal machine learning framework for molecules and crystals, Chemistry of Materials31, 3564 (2019)
2019
-
[41]
O. T. Unke and M. Meuwly, Physnet: A neural network for predicting energies, forces, dipole moments, and par- tial charges, Journal of Chemical Theory and Computa- tion15, 3678 (2019)
2019
-
[42]
Choudhary and B
K. Choudhary and B. DeCost, Atomistic line graph neu- ral network for improved materials property predictions, npj Computational Materials7, 185 (2021)
2021
-
[43]
Reiser, M
P. Reiser, M. Neubert, A. Eberhard, L. Torresi, C. Zhou, C. Shao, H. Metni, C. van Hoesel, H. Schopmans, T. Sommer, and P. Friederich, Graph neural networks for materials science and chemistry, Communications Mate- rials3, 93 (2022)
2022
-
[44]
M. Dai, M. F. Demirel, Y. Liang, and J.-M. Hu, Graph neural networks for an accurate and interpretable pre- diction of the properties of polycrystalline materials, npj Computational Materials7, 103 (2021)
2021
-
[45]
Y. Fan and G.-W. Chern, Graph neural network force fields for adiabatic dynamics of lattice hamiltonians (2026), arXiv:2603.02039 [cond-mat.str-el]
-
[46]
K. G. Wilson, Confinement of quarks, Physical Review D10, 2445 (1974)
1974
-
[47]
J. B. Kogut and L. Susskind, Hamiltonian formulation of wilson’s lattice gauge theories, Physical Review D11, 395 (1975)
1975
-
[48]
J. B. Kogut, An introduction to lattice gauge theory and spin systems, Reviews of Modern Physics51, 659 (1979)
1979
-
[49]
Creutz,Quarks, Gluons and Lattices(Cambridge Uni- versity Press, 1983)
M. Creutz,Quarks, Gluons and Lattices(Cambridge Uni- versity Press, 1983)
1983
-
[50]
H. J. Rothe,Lattice Gauge Theories: An Introduction, 4th ed. (World Scientific, 2012)
2012
-
[51]
Wen, Mean-field theory of spin-liquid states with finite energy gap and topological orders, Physical Review B44, 2664 (1991)
X.-G. Wen, Mean-field theory of spin-liquid states with finite energy gap and topological orders, Physical Review B44, 2664 (1991)
1991
-
[52]
Wen, Quantum orders and symmetric spin liquids, Physical Review B65, 165113 (2002)
X.-G. Wen, Quantum orders and symmetric spin liquids, Physical Review B65, 165113 (2002)
2002
-
[53]
Hermele, T
M. Hermele, T. Senthil, and M. P. A. Fisher, Stability of u(1) spin liquids in two dimensions, Physical Review B 70, 214437 (2004)
2004
-
[54]
P. A. Lee, N. Nagaosa, and X.-G. Wen, Doping a mott in- sulator: Physics of high-temperature superconductivity, Reviews of Modern Physics78, 17 (2006)
2006
-
[55]
Savary and L
L. Savary and L. Balents, Quantum spin liquids: A re- view, Reports on Progress in Physics80, 016502 (2017)
2017
-
[56]
Y. Zhou, K. Kanoda, and T.-K. Ng, Quantum spin liquid states, Reviews of Modern Physics89, 025003 (2017)
2017
-
[57]
Wen,Quantum Field Theory of Many-Body Sys- tems(Oxford University Press, 2004)
X.-G. Wen,Quantum Field Theory of Many-Body Sys- tems(Oxford University Press, 2004). 12
2004
-
[58]
Fradkin,Field Theories of Condensed Matter Physics, 2nd ed
E. Fradkin,Field Theories of Condensed Matter Physics, 2nd ed. (Cambridge University Press, 2013)
2013
-
[59]
Kitaev, Anyons in an exactly solved model and be- yond, Annals of Physics321, 2 (2006)
A. Kitaev, Anyons in an exactly solved model and be- yond, Annals of Physics321, 2 (2006)
2006
-
[60]
Hermanns, I
M. Hermanns, I. Kimchi, and J. Knolle, Physics of the ki- taev model: Fractionalization, dynamic correlations, and material connections, Annual Review of Condensed Mat- ter Physics9, 17 (2018)
2018
-
[61]
Chandrasekharan and U.-J
S. Chandrasekharan and U.-J. Wiese, Quantum link models: A discrete approach to gauge theories, Nuclear Physics B492, 455 (1997)
1997
-
[62]
Brower, S
R. Brower, S. Chandrasekharan, and U.-J. Wiese, Qcd as a quantum link model, Phys. Rev. D60, 094502 (1999)
1999
-
[63]
Wiese, Ultracold quantum gases and lattice sys- tems: quantum simulation of lattice gauge theories, An- nalen der Physik525, 777 (2013)
U.-J. Wiese, Ultracold quantum gases and lattice sys- tems: quantum simulation of lattice gauge theories, An- nalen der Physik525, 777 (2013)
2013
-
[64]
U.-J. Wiese, From quantum link models to d-theory: A resource efficient framework for the quantum simulation and computation of gauge theories, Philosophical Trans- actions of the Royal Society A380, 20210068 (2022)
2022
-
[65]
Banerjee, M
D. Banerjee, M. Dalmonte, M. M¨ uller, E. Rico, P. Ste- bler, U.-J. Wiese, and P. Zoller, Atomic quantum simula- tion of dynamical gauge fields coupled to fermionic mat- ter: From string breaking to evolution after a quench, Phys. Rev. Lett.109, 175302 (2012)
2012
-
[66]
Zohar, J
E. Zohar, J. I. Cirac, and B. Reznik, Quantum simula- tions of gauge theories with ultracold atoms: Local gauge invariance from angular-momentum conservation, Phys. Rev. A88, 023617 (2013)
2013
-
[67]
Tagliacozzo, A
L. Tagliacozzo, A. Celi, P. Orland, M. W. Mitchell, and M. Lewenstein, Simulation of non-abelian gauge theo- ries with optical lattices, Nature Communications4, 2615 (2013)
2013
-
[68]
Zohar, J
E. Zohar, J. I. Cirac, and B. Reznik, Quantum simula- tions of lattice gauge theories using ultracold atoms in op- tical lattices, Reports on Progress in Physics79, 014401 (2015)
2015
-
[69]
Zhou, G.-X
Z.-Y. Zhou, G.-X. Su, J. C. Halimeh, R. Ott, H. Sun, P. Hauke, B. Yang, Z.-S. Yuan, J. Berges, and J.-W. Pan, Thermalization dynamics of a gauge theory on a quantum simulator, Science377, 311 (2022)
2022
-
[70]
Zohar, J
E. Zohar, J. I. Cirac, and B. Reznik, Simulating (2 + 1)- dimensional lattice QED with dynamical matter using ultracold atoms, Phys. Rev. Lett.110, 055302 (2013)
2013
-
[71]
Cheng and H
Y. Cheng and H. Zhai, Emergent u(1) lattice gauge the- ory in rydberg atom arrays, Nature Reviews Physics6, 566 (2024)
2024
-
[72]
F. M. Surace, P. P. Mazza, G. Giudici, A. Lerose, A. Gambassi, and M. Dalmonte, Lattice gauge theories and string dynamics in rydberg atom quantum simula- tors, Phys. Rev. X10, 021041 (2020)
2020
-
[73]
Favoni, A
M. Favoni, A. Ipp, D. I. M¨ uller, and D. Schuh, Lattice gauge equivariant convolutional neural networks, Phys. Rev. Lett.128, 032003 (2022)
2022
-
[74]
Bulusu, M
S. Bulusu, M. Favoni, A. Ipp, D. I. M¨ uller, and D. Schuh, Generalization capabilities of translationally equivariant neural networks, Phys. Rev. D104, 074504 (2021)
2021
-
[75]
Lehner and T
C. Lehner and T. Wettig, Gauge-equivariant neural net- works as preconditioners in lattice qcd, Phys. Rev. D108, 034503 (2023)
2023
-
[76]
A. Apte, C. C´ ordova, T.-C. Huang, and A. Ashmore, Deep learning lattice gauge theories, Phys. Rev. B110, 165133 (2024)
2024
-
[77]
T. S. Cohen, M. Weiler, B. Kicanaoglu, and M. Welling, Gauge equivariant convolutional networks and the icosa- hedral cnn, inProceedings of the 36th International Con- ference on Machine Learning (ICML), Proceedings of Machine Learning Research, Vol. 97 (2019) pp. 1321– 1330
2019
-
[78]
Boyda, G
D. Boyda, G. Kanwar, S. Racani` ere, D. J. Rezende, M. S. Albergo, K. Cranmer, D. C. Hackett, and P. E. Shana- han, Sampling using SU(n) gauge equivariant flows, Phys. Rev. D103, 074504 (2021)
2021
-
[79]
M. S. Albergo, G. Kanwar, and P. E. Shanahan, Flow- based generative models for markov chain monte carlo in lattice field theory, Phys. Rev. D100, 034515 (2019)
2019
-
[80]
Kanwar, M
G. Kanwar, M. S. Albergo, D. Boyda, K. Cranmer, D. C. Hackett, S. Racani` ere, D. J. Rezende, and P. E. Shana- han, Equivariant flow-based sampling for lattice gauge theory, Phys. Rev. Lett.125, 121601 (2020)
2020
discussion (0)
Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.