Differentiable free energy surface: a variational approach to directly observing rare events using generative deep-learning models
Pith reviewed 2026-05-10 15:54 UTC · model grok-4.3
The pith
VaFES uses reversible collective-variable extension to variationally optimize a continuous differentiable free energy surface directly from generative models without any pre-generated data.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
By extending a coarse-grained collective variable into its reversible equivalent, VaFES constructs an intermediate latent representation in which the collective variables occupy a subset of dimensions. This construction preserves physical interpretability while making the system energy exactly accessible, enabling variational optimization of the free energy surface without any pre-generated simulation data. A single optimization thereby yields both a continuous differentiable free energy surface and direct one-shot sampling of rare-event configurations.
What carries the argument
The reversible extension of a coarse-grained collective variable, which constructs a latent space where system energy is exactly accessible for variational optimization of the free energy surface.
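To fix notation for what follows (the symbols below are ours, inferred from the abstract rather than quoted from the paper): with configurations $x$, Hamiltonian $H(x)$, inverse temperature $\beta$, and CV map $\xi$, the surface being optimized is the CV marginal, and the reversible extension promotes $\xi$ to a bijection whose latent coordinates contain the CVs:

```latex
F(s) = -\beta^{-1} \ln \int \delta\bigl(\xi(x) - s\bigr)\, e^{-\beta H(x)} \,\mathrm{d}x,
\qquad
f : x \longmapsto (s,\, z_\perp), \quad s = \xi(x).
```

Because $f$ is invertible, $H\bigl(f^{-1}(s, z_\perp)\bigr)$ is exactly evaluable on any latent sample, which is the property the variational optimization relies on.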
If this is right
- One optimization produces both the continuous differentiable free energy surface and one-shot samples of rare-event configurations.
- The method reproduces the exact analytical free energy surface for the bistable dimer potential.
- It identifies the native folded state of chignolin in close alignment with the experimental NMR structure.
- The framework scales to complex statistical systems while preserving interpretability and controllability of the chosen collective variables.
Where Pith is reading between the lines
- The differentiable surface could be used directly for gradient-based searches of minimum-energy paths between states.
- The one-shot sampling property might reduce the need for long equilibrium simulations when estimating transition rates in high-dimensional systems.
- Because the latent space is built from arbitrary collective variables, the approach could be tested on systems where standard reaction coordinates are difficult to define in advance.
Load-bearing premise
Extending any coarse-grained collective variable into a reversible equivalent produces a latent space in which the system energy can be accessed exactly without pre-generated data.
What would settle it
Running VaFES on the bistable dimer potential and finding that the optimized surface deviates from the known analytical free energy surface.
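The proposed falsification test can be made concrete in one dimension. The sketch below uses a generic double-well form with made-up parameters, not the paper's actual dimer model; it computes the analytical reference profile and the features, two basins, barrier height, and symmetric populations, that any optimized surface would have to match.

```python
import numpy as np

# A generic bistable bond potential -- a stand-in for the paper's dimer, not its
# actual parameters: minima at r0 - w and r0 + w, barrier height h at r0.
def U(r, h=4.0, r0=1.5, w=0.5):
    return h * (1.0 - ((r - r0) / w) ** 2) ** 2

beta = 1.0
r = np.linspace(0.5, 2.5, 2001)
dr = r[1] - r[0]

# With the bond length itself as the 1D collective variable, the exact free
# energy profile is just U(r) up to an additive constant.
F = U(r)
F -= F.min()

# Reference features a converged VaFES surface would have to reproduce:
mins = np.where((F[1:-1] < F[:-2]) & (F[1:-1] < F[2:]))[0] + 1   # two basins
barrier = F[np.argmin(np.abs(r - 1.5))]                          # height h

# Boltzmann marginal and basin populations (symmetric, so each is ~0.5):
p = np.exp(-beta * F)
p /= p.sum() * dr
left = p[r <= 1.5].sum() * dr
```

Any systematic deviation of the optimized surface from this kind of closed-form reference would settle the question in the negative.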
Original abstract
Rare events are central to the evolution of complex many-body systems, characterized as key transitional configurations on the free energy surface (FES). Conventional methods require adequate sampling of rare event transitions to obtain the FES, which is computationally very demanding. Here we introduce the variational free energy surface (VaFES), a dataset-free framework that directly models FESs using tractable-density generative models. Rare events can then be immediately identified from the FES with their configurations generated directly via one-shot sampling of generative models. By extending a coarse-grained collective variable (CV) into its reversible equivalent, VaFES constructs a latent space of intermediate representation in which the CVs explicitly occupy a subset of dimensions. This latent-space construction preserves the physical interpretability and transparent controllability of the CVs by design, while accommodating arbitrary CV formulations. The reversibility makes the system energy exactly accessible, enabling variational optimization of the FES without pre-generated simulation data. A single optimization yields a continuous, differentiable FES together with one-shot generation of rare-event configurations. Our method can reproduce the exact analytical solution for the bistable dimer potential and identify a chignolin native folded state in close alignment with the experimental NMR structure. Our approach thus establishes a scalable, systematic framework for advancing the study of complex statistical systems.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript introduces VaFES, a dataset-free variational framework that employs tractable-density generative models to directly construct a continuous, differentiable free energy surface (FES). A coarse-grained collective variable is extended into a reversible latent-space representation that preserves interpretability and allows exact evaluation of the physical energy H(x) on generated samples. Variational minimization of the KL divergence between the model density and the Boltzmann distribution is performed without pre-generated trajectories, yielding both the FES and one-shot sampling of rare-event configurations. The method is reported to recover the exact analytical FES of a bistable dimer potential and to identify a chignolin native state consistent with NMR data.
Significance. If the central variational procedure is shown to converge reliably, the work would constitute a notable advance in computational statistical mechanics by eliminating the need for expensive rare-event sampling while supplying a differentiable FES and direct configurational generation. The reversible CV extension supplies both physical transparency and controllability, features that are rarely combined in generative-model approaches to many-body systems. The dataset-free character and one-shot generation capability are particularly attractive for scaling to larger systems where conventional enhanced-sampling methods become prohibitive.
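The variational machinery summarized above can be reduced to a few lines. The sketch below is a deliberately minimal caricature, not the paper's method: a one-layer affine map stands in for the reversible network, and a harmonic energy whose Boltzmann density is the standard normal serves as target. It shows how the reverse-KL loss is estimated from base-distribution samples alone, with no pre-generated data.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0

# Toy target: harmonic energy H(x) = x^2 / 2, whose Boltzmann density is N(0, 1).
def H(x):
    return 0.5 * x ** 2

# A one-layer affine "flow" x = mu + exp(log_sig) * z stands in for the paper's
# reversible network; mu and log_sig are the only variational parameters.
mu, log_sig = 2.0, 1.0        # deliberately wrong initialization
lr = 0.05
for _ in range(2000):
    z = rng.standard_normal(512)              # base-distribution samples only
    x = mu + np.exp(log_sig) * z
    # Reverse-KL loss up to a constant: E_z[beta * H(x(z))] - log|det J|,
    # where log|det J| = log_sig for this map. Gradients via the
    # reparameterization trick, written out by hand for the toy.
    grad_mu = np.mean(beta * x)
    grad_ls = np.mean(beta * x * np.exp(log_sig) * z) - 1.0
    mu -= lr * grad_mu
    log_sig -= lr * grad_ls

# At the optimum the pushforward matches the Boltzmann density:
# mu -> 0 and exp(log_sig) -> 1.
```

The same objective, with a deep invertible network in place of the affine map and a molecular Hamiltonian in place of the toy energy, is the procedure the manuscript describes.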
major comments (3)
- [Abstract] Abstract and the bistable-dimer results: the claim of exact recovery of the analytical FES is stated without quantitative error metrics, derivation of the variational objective, or ablation on initialization and optimizer choices; because this is the primary numerical verification of the central claim that the procedure yields the true FES, the absence of these details leaves the soundness of the optimization unverified.
- [Variational optimization and latent space construction] Description of the latent-space construction and variational optimization: reversibility grants exact access to H(x) via the change-of-variables formula, yet the loss gradients are computed from samples drawn from an initial base distribution (typically isotropic Gaussian) whose overlap with high-barrier rare-event regions is exponentially small; no auxiliary exploration mechanism (e.g., tempering, mode-seeking regularizers, or curriculum) is supplied, so the optimization has no signal to populate secondary basins and may collapse to the dominant mode, undermining the rare-event applicability asserted in the abstract.
- [Chignolin results] Chignolin application: the reported alignment of the generated native state with the experimental NMR structure is presented without error bars on the structural RMSD, sensitivity analysis to the chosen coarse-grained CV, or comparison against a standard MD reference trajectory; these omissions make it impossible to assess whether the variational FES has correctly captured the relevant rare-event basin or merely reproduced a plausible low-energy configuration.
minor comments (2)
- [Methods] The notation for the reversible extension of the collective variable into the latent dimensions should be introduced with an explicit equation linking the Jacobian determinant to the density evaluation.
- [Figures] Figure captions for the bistable potential and chignolin FES projections would benefit from stating the precise CV definitions and the number of independent optimization runs performed.
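The equation the first minor comment asks for is presumably the standard change-of-variables identity (notation ours, not the paper's): with the reversible extension $f$ mapping a configuration to latents $(s, z_\perp)$,

```latex
\log q_X(x) = \log q_Z\bigl(f(x)\bigr)
+ \log \left|\det \frac{\partial f}{\partial x}\right|,
\qquad f(x) = \bigl(\xi(x),\, z_\perp\bigr),
```

so the model density, and with it the KL objective against the Boltzmann weight $e^{-\beta H(x)}$, is exactly evaluable on every generated sample.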
Simulated Author's Rebuttal
We thank the referee for their thorough and insightful comments, which have helped us improve the clarity and rigor of our manuscript. We provide point-by-point responses to the major comments below.
Point-by-point responses
-
Referee: [Abstract] Abstract and the bistable-dimer results: the claim of exact recovery of the analytical FES is stated without quantitative error metrics, derivation of the variational objective, or ablation on initialization and optimizer choices; because this is the primary numerical verification of the central claim that the procedure yields the true FES, the absence of these details leaves the soundness of the optimization unverified.
Authors: We agree that quantitative error metrics and additional details would strengthen the verification of the central claim. In the revised manuscript we will report the root-mean-square deviation between the recovered FES and the exact analytical FES for the bistable dimer, include a concise derivation of the variational KL objective in the Methods section, and add a short ablation on initialization and optimizer choices demonstrating convergence behavior. revision: yes
-
Referee: [Variational optimization and latent space construction] Description of the latent-space construction and variational optimization: reversibility grants exact access to H(x) via the change-of-variables formula, yet the loss gradients are computed from samples drawn from an initial base distribution (typically isotropic Gaussian) whose overlap with high-barrier rare-event regions is exponentially small; no auxiliary exploration mechanism (e.g., tempering, mode-seeking regularizers, or curriculum) is supplied, so the optimization has no signal to populate secondary basins and may collapse to the dominant mode, undermining the rare-event applicability asserted in the abstract.
Authors: The referee correctly identifies a potential practical difficulty in variational inference for multimodal targets. Because the reversible latent-space construction supplies the exact physical energy H(x) for every generated sample, the KL objective provides a global gradient signal that penalizes under-representation of any mode according to its Boltzmann weight; this is distinct from standard generative-model training, which lacks direct energy access. In the bistable-dimer experiments the optimization consistently recovered both basins from isotropic-Gaussian initialization. We will add an explicit discussion of this mechanism and the conditions under which convergence occurs, but we do not introduce auxiliary exploration techniques in the present work. revision: partial
-
Referee: [Chignolin results] Chignolin application: the reported alignment of the generated native state with the experimental NMR structure is presented without error bars on the structural RMSD, sensitivity analysis to the chosen coarse-grained CV, or comparison against a standard MD reference trajectory; these omissions make it impossible to assess whether the variational FES has correctly captured the relevant rare-event basin or merely reproduced a plausible low-energy configuration.
Authors: We acknowledge these omissions limit the strength of the chignolin validation. In the revised manuscript we will report RMSD values with standard deviations obtained from multiple independent optimizations, include a sensitivity analysis with respect to the coarse-grained CV definition, and add a direct structural comparison against configurations sampled from a conventional MD trajectory of chignolin. revision: yes
Circularity Check
VaFES variational optimization is self-contained; no reduction to inputs by construction
Full rationale
The paper introduces a variational inference procedure that minimizes KL(q||p_Boltzmann) using a generative model whose density is tractable by change-of-variables after reversible CV extension. The physical Hamiltonian H(x) is supplied externally and evaluated on generated samples; the resulting FES is the optimized model output, not a redefinition of the input distribution. Validation reproduces an independent analytical bistable solution and matches an external NMR structure, confirming the procedure is falsifiable against benchmarks outside the fitted parameters. No self-citations are load-bearing, no ansatz is smuggled, and no fitted quantity is relabeled as a prediction. The derivation chain therefore remains non-circular.
Axiom & Free-Parameter Ledger
free parameters (1)
- generative model parameters
axioms (1)
- domain assumption: Extending a coarse-grained collective variable into its reversible equivalent preserves physical interpretability and makes the system energy exactly accessible in the latent space.
Reference graph
Works this paper leans on
- [1] This pathway has a relatively high free energy barrier, suggesting that it is unlikely to be the dominant folding route of chignolin. The lower pathway in Fig. 5(b), which is more likely, has a smaller barrier and proceeds in three stages. Folding first involves hydrophobic assembly of the two aromatic rings together with the formation of a nascent turn, ...
- [2] G. Torrie and J. Valleau, Nonphysical sampling distributions in Monte Carlo free-energy estimation: Umbrella sampling, Journal of Computational Physics 23, 187 (1977).
- [3] E. Carter, G. Ciccotti, J. T. Hynes, and R. Kapral, Constrained reaction coordinate dynamics for the simulation of rare events, Chemical Physics Letters 156, 472 (1989).
- [4] M. Sprik and G. Ciccotti, Free energy from constrained molecular dynamics, The Journal of Chemical Physics 109, 7737 (1998).
- [5] F. Wang and D. P. Landau, Efficient, multiple-range random walk algorithm to calculate the density of states, Phys. Rev. Lett. 86, 2050 (2001).
- [6] A. Laio and M. Parrinello, Escaping free-energy minima, Proceedings of the National Academy of Sciences 99, 12562 (2002).
- [7] A. Barducci, M. Bonomi, and M. Parrinello, Metadynamics, WIREs Computational Molecular Science 1, 826 (2011).
- [8] A. Laio and F. L. Gervasio, Metadynamics: a method to simulate rare events and reconstruct the free energy in biophysics, chemistry and material science, Reports on Progress in Physics 71, 126601 (2008).
- [9] M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul, An introduction to variational methods for graphical models, Machine Learning 37, 183 (1999).
- [10] D. P. Kingma and P. Dhariwal, Glow: Generative flow with invertible 1x1 convolutions, in Advances in Neural Information Processing Systems, Vol. 31, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett (Curran Associates, Inc., 2018).
- [11] L. Dinh, J. Sohl-Dickstein, and S. Bengio, Density estimation using Real NVP (2017), arXiv:1605.08803 [cs.LG].
- [12] A. van den Oord, N. Kalchbrenner, and K. Kavukcuoglu, Pixel recurrent neural networks, in Proceedings of The 33rd International Conference on Machine Learning, Proceedings of Machine Learning Research, Vol. 48, edited by M. F. Balcan and K. Q. Weinberger (PMLR, New York, NY, USA, 2016), pp. 1747–1756.
- [13] F. Noé, S. Olsson, J. Köhler, and H. Wu, Boltzmann generators: Sampling equilibrium states of many-body systems with deep learning, Science 365, eaaw1147 (2019).
- [14]
- [15] D. Wu, L. Wang, and P. Zhang, Solving statistical mechanics using variational autoregressive networks, Phys. Rev. Lett. 122, 080602 (2019).
- [16]
- [17] S.-H. Li, Y.-W. Zhang, and D. Pan, Deep generative modeling of the canonical ensemble with differentiable thermal properties, Phys. Rev. Lett. (2025).
- [18] P. Wirnsberger, G. Papamakarios, B. Ibarz, S. Racanière, A. J. Ballard, A. Pritzel, and C. Blundell, Normalizing flows for atomic solids, Machine Learning: Science and Technology 3, 025009 (2022).
- [19] R. Ahmad and W. Cai, Free energy calculation of crystalline solids using normalizing flows, Modelling and Simulation in Materials Science and Engineering 30, 065007 (2022).
- [20]
- [21] H. Xie, Z.-H. Li, H. Wang, L. Zhang, and L. Wang, Deep variational free energy approach to dense hydrogen, Physical Review Letters 131, 126501 (2023).
- [22] L. Zhang, J. Han, H. Wang, R. Car, and W. E, Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics, Phys. Rev. Lett. 120, 143001 (2018).
- [23] J. Ho, A. Jain, and P. Abbeel, Denoising diffusion probabilistic models, Advances in Neural Information Processing Systems 33, 6840 (2020).
- [24] A. Radford, K. Narasimhan, T. Salimans, and I. Sutskever, Improving language understanding by generative pre-training, OpenAI Blog 3 (2018).
- [25] R. K. Cersonsky, B. Cheng, M. De Vivo, and P. Tiwary, Machine learning and statistical mechanics: Shared synergies for next generation of chemical theory and computation, Journal of Chemical Theory and Computation 21, 5359 (2025).
- [26] E. C. Goonetilleke, B. Liu, Y. Wu, M. S. O'Connor, and X. Huang, A practical guide to transition state analysis in biomolecular simulations with TS-DAR, The Journal of Physical Chemistry B 129, 12133 (2025).
- [27] P. Tiwary, L. Herron, R. John, S. Lee, D. Sanwal, and R. Wang, Generative AI for computational chemistry: A roadmap to predicting emergent phenomena, Proceedings of the National Academy of Sciences 122, e2415655121 (2025).
- [28]
- [29] E. R. Beyerle and P. Tiwary, Inferring the isotropic-nematic phase transition with generative machine learning, Phys. Rev. Lett. 135, 068102 (2025).
- [30] T.-S. Lee, B. K. Radak, A. Pabis, and D. M. York, A new maximum likelihood approach for free energy profile construction from molecular simulations, Journal of Chemical Theory and Computation 9, 153 (2013).
- [31] O. Valsson and M. Parrinello, Variational approach to enhanced sampling and free energy calculations, Phys. Rev. Lett. 113, 090601 (2014).
- [32] M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information (Cambridge University Press, 2010).
- [33] M. Schebek, M. Invernizzi, F. Noé, and J. Rogal, Efficient mapping of phase diagrams with conditional Boltzmann generators, Machine Learning: Science and Technology 5, 045045 (2024).
- [34] S. Falkner, A. Coretti, S. Romano, P. L. Geissler, and C. Dellago, Conditioning Boltzmann generators for rare event sampling, Machine Learning: Science and Technology 4, 035050 (2023).
- [35] Y. Xie, L. Winkler, L. Sun, S. Lewis, A. E. Foster, J. Jimenez-Luna, T. Hempel, M. Gastegger, Y. Chen, I. Zaporozhets, C. Clementi, C. M. Bishop, and F. Noé, Enhanced diffusion sampling: efficient rare event sampling and free energy calculation with diffusion models, arXiv preprint arXiv:2602.16634 (2026).
- [36] H. Jónsson, G. Mills, and K. W. Jacobsen, Nudged elastic band method for finding minimum energy paths of transitions, in Classical and Quantum Dynamics in Condensed Phase Simulations, pp. 385–404.
- [37] G. Mills, H. Jónsson, and G. K. Schenter, Reversible work transition state theory: application to dissociative adsorption of hydrogen, Surface Science 324, 305 (1995).
- [38] W. E, W. Ren, and E. Vanden-Eijnden, String method for the study of rare events, Phys. Rev. B 66, 052301 (2002).
- [39] D. P. Kingma and M. Welling, Auto-encoding variational Bayes (2022), arXiv:1312.6114 [stat.ML].
- [40] I. J. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, Generative adversarial networks (2014), arXiv:1406.2661 [stat.ML].
- [41] Y. Li, H. Yi, C. Bender, S. Shan, and J. B. Oliva, Exchangeable neural ODE for set modeling, Advances in Neural Information Processing Systems 33, 6936 (2020).
- [42] M. Biloš and S. Günnemann, Scalable normalizing flows for permutation invariant densities, in Proceedings of the 38th International Conference on Machine Learning, Proceedings of Machine Learning Research, Vol. 139, edited by M. Meila and T. Zhang (PMLR, 2021), pp. 957–967.
- [43] V. G. Satorras, E. Hoogeboom, and M. Welling, E(n) equivariant graph neural networks, in International Conference on Machine Learning (PMLR, 2021), pp. 9323–9332.
- [44] R. Winter, M. Bertolini, T. Le, F. Noé, and D.-A. Clevert, Unsupervised learning of group invariant and equivariant representations, Advances in Neural Information Processing Systems 35, 31942 (2022).
- [45] C. H. Bennett, Logical reversibility of computation, IBM Journal of Research and Development 17, 525 (1973).
- [46] K. S. Perumalla, Introduction to Reversible Computing (CRC Press, 2013).
- [47] S. Kullback and R. A. Leibler, On information and sufficiency, The Annals of Mathematical Statistics 22, 79 (1951).
- [48]
- [49] S. Zheng, J. He, C. Liu, Y. Shi, Z. Lu, W. Feng, F. Ju, J. Wang, J. Zhu, Y. Min, H. Zhang, S. Tang, H. Hao, P. Jin, C. Chen, F. Noé, H. Liu, and T.-Y. Liu, Predicting equilibrium distributions for molecular systems with deep learning, Nature Machine Intelligence 6, 558 (2024).
- [50] S. Lewis, T. Hempel, J. Jimenez-Luna, M. Gastegger, Y. Xie, A. Y. K. Foong, V. Garcia Satorras, O. Abdin, B. S. Veeling, I. Zaporozhets, Y. Chen, S. Yang, A. Schneuing, J. Nigam, F. Barbero, V. Stimper, A. Campbell, J. Yim, M. Lienen, Y. Shi, S. Zheng, H. Schulz, U. Munir, C. Clementi, and F. Noé, Scalable emulation of protein equilibrium ensembles wi... (2025).
- [51] J. Abramson, J. Adler, J. Dunger, R. Evans, T. Green, A. Pritzel, O. Ronneberger, L. Willmore, A. J. Ballard, J. Bambrick, S. W. Bodenstein, D. A. Evans, C.-C. Hung, M. O'Neill, D. Reiman, K. Tunyasuvunakool, Z. Wu, A. Žemgulytė, E. Arvaniti, C. Beattie, O. Bertolli, A. Bridgland, A. Cherepanov, M. Congreve, A. I. Cowen-Rivers, A. Cowie, M. Figurnov, ... (2024).
- [52]
- [53] J. P. Nilmeier, G. E. Crooks, D. D. L. Minh, and J. D. Chodera, Nonequilibrium candidate Monte Carlo is an efficient tool for equilibrium simulation, Proceedings of the National Academy of Sciences 108, E1009 (2011).
- [54] D.-Y. Hwang and A. M. Mebel, Reaction mechanism of N2/H2 conversion to NH3: A theoretical study, The Journal of Physical Chemistry A 107, 2865 (2003).
- [55] A. M. Sand, C. A. Schwerdtfeger, and D. A. Mazziotti, Strongly correlated barriers to rotation from parametric two-electron reduced-density-matrix methods in application to the isomerization of diazene, The Journal of Chemical Physics 136, 034112 (2012).
- [56] J. Behrmann, W. Grathwohl, R. T. Q. Chen, D. Duvenaud, and J.-H. Jacobsen, Invertible residual networks, in Proceedings of the 36th International Conference on Machine Learning, Proceedings of Machine Learning Research, Vol. 97, edited by K. Chaudhuri and R. Salakhutdinov (PMLR, 2019), pp. 573–582.
- [57] W. Grathwohl, R. T. Q. Chen, J. Bettencourt, I. Sutskever, and D. Duvenaud, FFJORD: Free-form continuous dynamics for scalable reversible generative models (2018), arXiv:1810.01367 [cs.LG].
- [58] R. T. Q. Chen, Y. Rubanova, J. Bettencourt, and D. Duvenaud, Neural ordinary differential equations (2019), arXiv:1806.07366 [cs.LG].
- [59] L. Molgedey and H. G. Schuster, Separation of a mixture of independent signals using time delayed correlations, Phys. Rev. Lett. 72, 3634 (1994).
- [60] G. Pérez-Hernández, F. Paul, T. Giorgino, G. De Fabritiis, and F. Noé, Identification of slow molecular order parameters for Markov model construction, The Journal of Chemical Physics 139, 015102 (2013).
- [61] D. Satoh, K. Shimizu, S. Nakamura, and T. Terada, Folding free-energy landscape of a 10-residue mini-protein, chignolin, FEBS Letters 580, 3422 (2006).
- [62] Y. Maruyama, S. Koroku, M. Imai, K. Takeuchi, and A. Mitsutake, Mutation-induced change in chignolin stability from π-turn to α-turn, RSC Advances 10, 22797 (2020).
- [63] A.-L. M. Fischer, A. Tichy, J. Kokot, V. J. Hoerschinger, R. F. Wild, J. R. Riccabona, J. R. Loeffler, F. Waibl, P. K. Quoika, P. Gschwandtner, S. Forli, A. B. Ward, K. R. Liedl, M. Zacharias, and M. L. Fernández-Quintero, The role of force fields and water models in protein folding and unfolding dynamics, Journal of Chemical Theory and Computation 20, 2321 (2024).
- [64] P. Kunzmann, J. M. Anter, and K. Hamacher, Adding hydrogen atoms to molecular models via fragment superimposition, Algorithms for Molecular Biology 17, 7 (2022).
- [65] P. Kang, E. Trizio, and M. Parrinello, Computing the committor with the committor to study the transition state ensemble, Nature Computational Science 4, 451 (2024).
- [66] See the SI for details about (a) the force field information, the deep-learning model structure, and extra parameters used in each application; (b) the detailed scheme of bijective transformations used in each application. The SI cites [16, 47, 66, 87–89].
- [67] K. He, X. Zhang, S. Ren, and J. Sun, Deep residual learning for image recognition, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2016), pp. 770–778.
- [68] R. Harada and A. Kitao, Exploring the folding free energy landscape of a β-hairpin miniprotein, chignolin, using multiscale free energy landscape calculation method, The Journal of Physical Chemistry B 115, 8806 (2011).
- [69] A. Suenaga, T. Narumi, N. Futatsugi, R. Yanai, Y. Ohno, N. Okimoto, and M. Taiji, Folding dynamics of 10-residue β-hairpin peptide chignolin, Chemistry – An Asian Journal 2, 591 (2007).
- [70] I. A. Hubner, E. J. Deeds, and E. I. Shakhnovich, Understanding ensemble protein folding at atomic detail, Proceedings of the National Academy of Sciences 103, 17747 (2006).
- [71] K. Lindorff-Larsen, S. Piana, R. O. Dror, and D. E. Shaw, How fast-folding proteins fold, Science 334, 517 (2011).
- [72] A. N. Al-Rabadi, Reversible Logic Synthesis: From Fundamentals to Quantum Computing (Springer Science & Business Media, 2012).
- [73] B. Liu, J. G. Boysen, I. C. Unarta, X. Du, Y. Li, and X. Huang, Exploring transition states of protein conformational changes via out-of-distribution detection in the hyperspherical latent space, Nature Communications 16, 349 (2025).
- [74] B. Liu, S. Cao, J. G. Boysen, M. Xue, and X. Huang, Memory kernel minimization-based neural networks for discovering slow collective variables of biomolecular dynamics, Nature Computational Science 5, 562 (2025).
- [75] B. Liu, M. Xue, Y. Qiu, K. A. Konovalov, M. S. O'Connor, and X. Huang, GraphVAMPnets for uncovering slow collective variables of self-assembly dynamics, The Journal of Chemical Physics 159 (2023).
- [76]
- [77] L. Klein, A. Krämer, and F. Noé, Equivariant flow matching, Advances in Neural Information Processing Systems 36, 59886 (2023).
- [78] G. Kanwar, M. S. Albergo, D. Boyda, K. Cranmer, D. C. Hackett, S. Racanière, D. J. Rezende, and P. E. Shanahan, Equivariant flow-based sampling for lattice gauge theory, Physical Review Letters 125, 121601 (2020).
- [79] I. Goodfellow, NIPS 2016 tutorial: Generative adversarial networks (2017), arXiv:1701.00160 [cs.LG].
- [80] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning (MIT Press, 2016), http://www.deeplearningbook.org.