pith. machine review for the scientific record.

arxiv: 2605.08915 · v1 · submitted 2026-05-09 · 💻 cs.LG

Recognition: no theorem link

Physics-Informed Neural PDE Solvers via Spatio-Temporal MeanFlow

Authors on Pith · no claims yet

Pith reviewed 2026-05-12 03:48 UTC · model grok-4.3

classification 💻 cs.LG
keywords physics-informed neural networks · PDE solvers · MeanFlow · neural operators · spatio-temporal integration · continuous-time models

The pith

Replacing the generative velocity field with the physical PDE operator and extending the mean constraint to space and time creates a unified neural solver for both evolving and stationary equations.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

Traditional neural PDE solvers rely on pointwise residuals or fixed time grids that overlook the continuous integral character of physical systems. This paper adapts MeanFlow, an efficient continuous-time integrator from generative modeling, by swapping its velocity field for the PDE operator itself. The key extension applies the original mean constraint across both space and time rather than time alone, enforcing consistency throughout the domain. The resulting model directly predicts the physical state after any chosen interval length without repeated stepping. Experiments on standard benchmarks report higher accuracy, faster inference, and better generalization to new initial conditions and grid resolutions than representative baselines.

Core claim

By substituting the generative velocity field with the physical PDE operator, we transform multi-step numerical integration into an efficient prediction with a freely controllable integration length. Crucially, we extend the original MeanFlow constraint from the temporal to the spatio-temporal domain, coupling time evolution with spatial consistency. This yields a unified framework naturally accommodating both time-dependent and stationary PDEs.
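Read concretely, the substitution can be reconstructed as follows (our notation, assembled from the abstract; the paper's own symbols may differ). If the state obeys a PDE with operator F, the mean field is the interval average of that operator:

```latex
\partial_t u = F(u),
\qquad
m\bigl(u(\cdot,\tau),\tau,t\bigr)
  \;\triangleq\; \frac{1}{t-\tau}\int_{\tau}^{t} F\bigl(u(\cdot,s)\bigr)\,ds,
\qquad
u(\cdot,t) = u(\cdot,\tau) + (t-\tau)\,m\bigl(u(\cdot,\tau),\tau,t\bigr).
```

Differentiating the last equation in t gives the trainable MeanFlow identity m = F(u(·,t)) − (t−τ) ∂_t m; the claimed extension applies the analogous averaging along spatial directions as well, which is what couples time evolution with spatial consistency.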

What carries the argument

Spatio-Temporal MeanFlow, obtained by replacing the generative velocity with the PDE operator and extending the integral constraint over space and time to predict finite-interval state evolution in one step.

If this is right

  • Multi-step numerical integration reduces to a single forward pass whose length can be chosen freely at inference time.
  • The same trained model handles both time-dependent evolution and stationary problems without separate formulations.
  • The integral constraint produces strong generalization to out-of-distribution initial conditions and varying spatial resolutions.
  • Inference becomes faster than repeated-step baselines while maintaining or improving accuracy on benchmark problems.
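The first two bullets can be made concrete with a toy case in which the mean-flow function is known in closed form (our illustration; the paper's model is a trained network, not this formula):

```python
import numpy as np

# Toy illustration (ours, not the paper's code): for du/dt = a*u the
# mean-flow function has the closed form m(u0, dt) = u0*(exp(a*dt)-1)/dt,
# so the finite-interval prediction u0 + dt*m(u0, dt) is exact for any
# interval length dt chosen at inference time.
a = -0.7

def mean_flow(u0, dt):
    # interval average of du/dt over [0, dt]
    return u0 * (np.exp(a * dt) - 1.0) / dt

def step(u0, dt):
    # one-shot finite-interval prediction
    return u0 + dt * mean_flow(u0, dt)

u0 = 2.0
one_jump = step(u0, 1.0)        # single forward pass over the full interval
rolled = u0
for _ in range(4):              # repeated stepping over sub-intervals
    rolled = step(rolled, 0.25)
# both recover u0 * exp(a): the interval length is a free knob, not a grid
```

With an exact mean-flow function the single jump and the four-step rollout agree to machine precision; with a learned approximation they differ, which is exactly the accumulation effect the paper's Figure 3 probes.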

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The controllable interval length may allow error-driven adaptive stepping during simulation without retraining.
  • Conservation properties could be tested by checking whether the integral constraint better preserves invariants than residual-only losses over long horizons.
  • The continuous formulation might extend to parameter-dependent or stochastic PDEs by treating parameters as additional input dimensions.
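The adaptive-stepping idea in the first bullet can be sketched with a step-doubling error check (our hypothetical sketch; `m_hat` stands in for a trained spatio-temporal MeanFlow model and is deliberately crude so the check has error to detect):

```python
import numpy as np

# Hypothetical error-driven adaptive stepping with a finite-interval model,
# no retraining. For du/dt = -u**2 (exact solution 1/(1+t)), m_hat uses just
# the instantaneous rate as a crude stand-in for the learned interval mean.
def m_hat(u, dt):
    return -u * u                 # hypothetical "learned" mean flow

def step(u, dt):
    return u + dt * m_hat(u, dt)  # finite-interval prediction

def adaptive_advance(u, t_end, dt=0.5, tol=1e-5):
    t = 0.0
    while t < t_end - 1e-12:
        dt = min(dt, t_end - t)
        full = step(u, dt)                      # one jump of length dt
        half = step(step(u, dt / 2), dt / 2)    # two half-length jumps
        if abs(full - half) > tol:              # disagreement => refine dt
            dt /= 2
            continue
        u, t = half, t + dt
    return u

u_T = adaptive_advance(1.0, 2.0)  # exact answer: 1/(1+2) = 1/3
```

Because the interval length is a direct model input, the controller can shrink it mid-trajectory wherever the one-jump and two-jump predictions disagree, with no retraining and no fixed grid.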

Load-bearing premise

Substituting the PDE operator for the generative velocity field and extending the mean constraint to the full spatio-temporal domain preserves the integral properties and numerical stability of the original method for arbitrary PDEs.

What would settle it

A demonstration that the method produces unstable or inaccurate long-term trajectories on a stiff nonlinear PDE, such as the Kuramoto–Sivashinsky equation, when the integration interval is increased beyond the training range.

Figures

Figures reproduced from arXiv: 2605.08915 by Difan Zou, Hanru Bai, Yuncheng Zhou.

Figure 1
Figure 1: Accuracy-efficiency trade-off on the Burgers benchmark. Our Spatio-Temporal MeanFlow achieves an optimal Pareto frontier (bottom-left), delivering the lowest relative L2 error with highly competitive inference speeds across all baselines.
Figure 2
Figure 2: Overview of our framework. (A) Spatio-Temporal Path Dynamics, formulating PDE solutions via straight paths. (B) Space-Time Decoupling, constraining the surface in space or time direction only, designed to mitigate scale mismatches within the Hessian matrix.
Figure 3
Figure 3: Relative L2 error on Burgers with different numbers of integration steps. Increasing the number of steps from 1 to 4 reduces the relative L2 error, showing that the learned operator can be effectively reused over sub-intervals and benefits from moderate temporal refinement; using too many steps, however, degrades performance, likely due to accumulated approximation errors from repeated composition.
Figure 4
Figure 4: Qualitative visualization on the NS dataset. We compare ground-truth vorticity fields with predictions from our method at three time steps for two test samples.
Figure 5
Figure 5: Visual comparison of predicted fields and error maps on the NS dataset.
Figure 6
Figure 6: Training trajectories for all loss components, including the total loss, the data loss, the temporal MeanFlow loss, and the spatial MeanFlow loss.
Figure 7
Figure 7: Additional coefficient analysis on Burgers’ equation. Left: spatial MeanFlow coefficient sweep with fixed temporal MeanFlow coefficient λT-MF = 0.01. Right: temporal MeanFlow coefficient sweep with fixed spatial MeanFlow coefficient λS-MF = 0.01. The reported metric is global relative ℓ2 error at the base resolution s = 128; lower is better.
read the original abstract

Deep learning paradigms, such as PINNs and neural operators, have significantly advanced the solving of PDEs. However, they often struggle to capture the continuous integral nature of physical systems, relying either on pointwise residuals that ignore the integral perspective or on pre-discretized temporal grids. Drawing inspiration from MeanFlow, a continuous-time integrator recently developed to efficiently solve generative ODEs, we introduce Spatio-Temporal MeanFlow, which functions as a novel PDE solver learning the finite-interval evolution of physical states. By substituting the generative velocity field with the physical PDE operator, we transform multi-step numerical integration into an efficient prediction with a freely controllable integration length. Crucially, we extend the original MeanFlow constraint from the temporal to the spatio-temporal domain, coupling time evolution with spatial consistency. This yields a unified framework naturally accommodating both time-dependent and stationary PDEs. Comprehensive experiments on benchmarks demonstrate that our approach achieves superior accuracy and inference efficiency over representative baselines. Furthermore, the proposed integral constraint enables excellent generalization to out-of-distribution initial conditions and varying spatial resolutions.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript proposes Spatio-Temporal MeanFlow, a physics-informed neural PDE solver obtained by substituting the generative velocity field of the original MeanFlow ODE with the physical PDE operator F(u) and extending the temporal MeanFlow constraint to a spatio-temporal version. This substitution is claimed to convert multi-step numerical integration into a single efficient prediction whose integration length is freely controllable, while the spatio-temporal coupling yields a unified framework for both time-dependent and stationary PDEs. Experiments on standard benchmarks are reported to show superior accuracy and inference speed relative to PINNs and neural operators, together with strong generalization to out-of-distribution initial conditions and varying spatial resolutions.

Significance. If the substitution is shown to preserve the integral identity and numerical stability for general PDE operators, the approach would constitute a meaningful advance: it supplies an integral-constraint formulation that respects the continuous-time evolution of physical systems without requiring pre-discretized temporal grids. The controllable integration length and unified treatment of evolutionary and stationary problems could improve long-horizon accuracy and resolution flexibility in neural PDE solvers.

major comments (2)
  1. [§3] §3 (Spatio-Temporal MeanFlow construction): the central substitution of the generative velocity v by the PDE operator F(u) is asserted to preserve the MeanFlow integral identity and yield a valid finite-interval integrator, yet no theorem, derivation, or set of regularity conditions on F is supplied. For nonlinear, stiff, or only weakly Lipschitz operators the contractivity or Lipschitz assumptions implicit in the original MeanFlow analysis need not hold; without such justification the claim that the spatio-temporal extension remains consistent for arbitrary PDEs is unsupported and load-bearing for the unified-framework assertion.
  2. [§4] §4 (Experiments): the reported accuracy and generalization gains are presented without an accompanying error-growth analysis versus integration length or ablation on PDE operators that violate the regularity assumptions required by the substitution (e.g., stiff reaction-diffusion or hyperbolic systems). Table 1 and Figure 3 therefore do not yet isolate whether the observed improvements stem from the integral constraint or from other architectural choices.
minor comments (2)
  1. [Abstract, §2] The abstract and §2 should explicitly cite the original MeanFlow reference and clarify which of its integral identities are being extended.
  2. [§3] Notation for the spatio-temporal constraint (Eq. (7) or equivalent) should be introduced with a clear statement of the domain of integration and the precise form of the mean operator.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the detailed and constructive report. The two major comments identify genuine gaps in the current manuscript regarding theoretical justification and experimental validation. We address each point below and will incorporate revisions to strengthen the paper.

read point-by-point responses
  1. Referee: [§3] §3 (Spatio-Temporal MeanFlow construction): the central substitution of the generative velocity v by the PDE operator F(u) is asserted to preserve the MeanFlow integral identity and yield a valid finite-interval integrator, yet no theorem, derivation, or set of regularity conditions on F is supplied. For nonlinear, stiff, or only weakly Lipschitz operators the contractivity or Lipschitz assumptions implicit in the original MeanFlow analysis need not hold; without such justification the claim that the spatio-temporal extension remains consistent for arbitrary PDEs is unsupported and load-bearing for the unified-framework assertion.

    Authors: We agree that a formal derivation was missing. The substitution follows directly from the fundamental theorem of calculus: if u satisfies ∂u/∂t = F(u), then u(T) − u(0) = ∫_0^T F(u(t)) dt, which is precisely the MeanFlow integral identity with velocity replaced by the PDE operator. This identity holds whenever a solution exists, independent of Lipschitz constants. However, we acknowledge the manuscript provided no explicit derivation or regularity discussion. In the revision we will add a dedicated subsection in §3 that (i) derives the identity from the PDE, (ii) states the minimal assumption of local existence of solutions in appropriate Sobolev spaces, and (iii) notes that the integral constraint remains well-defined even for stiff or weakly Lipschitz operators (though uniqueness may require additional regularization). We will also clarify that the unified treatment of evolutionary and stationary problems follows by setting the time derivative to zero in the stationary case, without relying on contractivity. revision: yes
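    The fundamental-theorem identity invoked in this response is easy to verify numerically on a known solution (our sanity check, not part of the manuscript): for the 1D heat equation u_t = u_xx with exact solution u(x,t) = sin(x) e^{-t} on a periodic domain, F(u) = u_xx = −sin(x) e^{-t}.

```python
import numpy as np

# Sanity check (ours) of u(T) - u(0) = \int_0^T F(u(t)) dt for the 1D heat
# equation u_t = u_xx with exact solution u(x,t) = sin(x)*exp(-t), so that
# F(u) = u_xx = -sin(x)*exp(-t). The time integral is done by trapezoid rule.
x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
ts = np.linspace(0.0, 1.0, 2001)           # fine time grid for quadrature

u = lambda t: np.sin(x) * np.exp(-t)       # exact solution
F = lambda t: -np.sin(x) * np.exp(-t)      # u_xx of the exact solution

Fs = np.stack([F(t) for t in ts])          # shape (2001, 256)
dt = ts[1] - ts[0]
integral = (0.5 * (Fs[1:] + Fs[:-1]) * dt).sum(axis=0)   # trapezoid rule
gap = np.max(np.abs((u(1.0) - u(0.0)) - integral))       # ~quadrature error
```

    The gap is at the quadrature-error level (~1e-8 here), consistent with the authors' point that the identity holds whenever a solution exists; it says nothing, of course, about uniqueness or about how well a network approximates the interval mean.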

  2. Referee: [§4] §4 (Experiments): the reported accuracy and generalization gains are presented without an accompanying error-growth analysis versus integration length or ablation on PDE operators that violate the regularity assumptions required by the substitution (e.g., stiff reaction-diffusion or hyperbolic systems). Table 1 and Figure 3 therefore do not yet isolate whether the observed improvements stem from the integral constraint or from other architectural choices.

    Authors: We concur that the current experiments do not fully isolate the contribution of the integral constraint or test the method on operators that may violate strong regularity. In the revised manuscript we will add: (1) an error-growth plot showing L2 error versus controllable integration length for the Navier–Stokes and reaction-diffusion benchmarks; (2) new experiments on a stiff reaction-diffusion system (e.g., Allen–Cahn with large reaction coefficient) and a hyperbolic system (e.g., inviscid Burgers or linear wave equation); (3) an ablation that removes the spatio-temporal integral constraint while keeping the network architecture fixed, thereby isolating its effect. These additions will directly address whether gains arise from the MeanFlow-style constraint or from other design choices. revision: yes

Circularity Check

0 steps flagged

No circularity: derivation extends external MeanFlow without self-referential reduction

full rationale

The paper draws inspiration from an external MeanFlow reference for continuous-time integration and proposes a substitution of the velocity field with the PDE operator plus a spatio-temporal extension of the constraint. No quoted equations or steps in the abstract or description reduce the claimed integral properties, stability, or performance to a fitted parameter, self-definition, or self-citation chain by construction. The central construction is presented as an independent adaptation with external benchmarks, satisfying the criteria for a self-contained derivation against external references.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

The central claim rests on the domain assumption that the MeanFlow integral constraint remains valid after substitution of a general PDE operator and after extension to the spatial dimensions.

axioms (1)
  • domain assumption The MeanFlow constraint can be extended to the spatio-temporal domain while preserving its integral properties when the generative velocity field is replaced by a physical PDE operator.
    Invoked when the abstract states that the extension couples time evolution with spatial consistency and yields a unified framework.

pith-pipeline@v0.9.0 · 5483 in / 1348 out tokens · 88064 ms · 2026-05-12T03:48:55.537875+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

65 extracted references · 65 canonical work pages · 7 internal anchors
