pith · machine review for the scientific record

arxiv: 2604.19465 · v2 · submitted 2026-04-21 · ⚛️ physics.flu-dyn · cs.AI

Recognition: unknown

A neural operator framework for data-driven discovery of stability and receptivity in physical systems

Authors on Pith: no claims yet

Pith reviewed 2026-05-10 01:23 UTC · model grok-4.3

classification: ⚛️ physics.flu-dyn · cs.AI
keywords: neural operator · data-driven stability analysis · resolvent analysis · eigenmodes · fluid dynamics · automatic differentiation · dynamical systems · nonlinear systems

The pith

A neural network trained on data alone can compute eigenmodes and resolvent modes via automatic differentiation on its Jacobian.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper establishes a data-driven method to extract stability properties and optimal forcing responses from observational data without any governing equations. A neural network is trained to emulate the system's time evolution; automatic differentiation then supplies the Jacobian, from which eigenmodes and resolvent modes are obtained directly. This matters for nonlinear or high-dimensional systems such as chaotic models and fluid flows where equations are incomplete or unavailable, allowing identification of dominant instabilities and input-output structures even in strongly nonlinear regimes. The same emulator also supplies a nonlinear representation of the dynamics alongside the linear modal information.

Core claim

By training a neural network as a dynamics emulator and using automatic differentiation to extract its Jacobian, eigenmodes and resolvent modes can be computed directly from data. The method identifies dominant instability modes and input-output structures on canonical chaotic models and high-dimensional fluid flows, even in strongly nonlinear regimes, while also providing a nonlinear representation of system dynamics.
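The resolvent half of this claim reduces, once a Jacobian is in hand, to standard dense linear algebra: the singular vectors of (iωI − J)⁻¹ give the optimal forcing and response directions, and the singular values give the gains. A minimal NumPy sketch, using an invented 2×2 non-normal Jacobian rather than anything from the paper:

```python
import numpy as np

def resolvent_modes(J, omega):
    """SVD of the resolvent (i*omega*I - J)^-1: gains, response and forcing modes."""
    n = J.shape[0]
    R = np.linalg.inv(1j * omega * np.eye(n) - J)
    U, s, Vh = np.linalg.svd(R)
    # columns of U: response modes; columns of Vh.conj().T: forcing modes; s: gains
    return s, U, Vh.conj().T

# Illustrative non-normal, linearly stable Jacobian (not from the paper)
J = np.array([[-0.1, 10.0],
              [ 0.0, -0.2]])
gains, response, forcing = resolvent_modes(J, omega=0.5)
print(gains[0])  # leading gain is large despite stable eigenvalues
```

Non-normal operators like this one show large resolvent gains even when every eigenvalue is stable, which is exactly why receptivity analysis carries information beyond the eigenspectrum.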

What carries the argument

A neural network trained as a dynamics emulator, from whose output the Jacobian is obtained by automatic differentiation to furnish eigenmodes and resolvent modes.
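The pipeline can be caricatured in a few lines. The paper trains a deep network with a rollout loss and differentiates it automatically; as a dependency-free stand-in, the sketch below fits a linear least-squares emulator to snapshot pairs from a known linear map (whose Jacobian is the fitted matrix itself) and recovers the true eigenvalues. The toy system and all names are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth discrete-time dynamics x_{k+1} = A x_k (stands in for the unknown system)
A_true = np.array([[0.9, 0.5],
                   [0.0, 0.7]])

# "Observation data": snapshot pairs sampled from the system
X = rng.standard_normal((2, 200))
Y = A_true @ X

# Emulator fit: the paper trains a deep network with a rollout loss; here a
# linear least-squares map is the simplest emulator whose Jacobian is exact.
A_fit = Y @ np.linalg.pinv(X)

# Differentiating a linear emulator returns the matrix itself; its
# eigendecomposition gives the data-driven stability modes.
eigvals, eigvecs = np.linalg.eig(A_fit)
print(np.sort(eigvals.real))  # recovers 0.7 and 0.9
```

For linear dynamics this collapses to DMD; the paper's point is that a nonlinear neural emulator extends the same Jacobian extraction into regimes where a single global linear fit fails.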

If this is right

  • Stability and receptivity analysis becomes feasible for systems whose governing equations are unknown or incomplete.
  • The same trained emulator supplies both nonlinear dynamics and linear modal information.
  • Dominant modes can be recovered in strongly nonlinear and high-dimensional regimes where classical linearization is difficult.
  • The approach applies to observational datasets in climate science, neuroscience, and fluid engineering without requiring model derivation.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The framework could be tested on experimental time-series data where no full model exists, to check whether predicted modes align with observed perturbation growth.
  • Embedding the emulator in a control loop would allow real-time adjustment of forcings based on the extracted resolvent information.
  • Accuracy in recovering known modes from data with added noise would quantify the method's robustness for practical measurements.

Load-bearing premise

The trained neural network must approximate the true underlying dynamics closely enough that derivatives of its outputs with respect to inputs recover meaningful stability and receptivity information about the physical system.

What would settle it

In a system with known equations, such as the Lorenz attractor or a simple shear flow, the claim would be refuted if the eigenmodes or resolvent modes extracted from the neural Jacobian failed to match those computed analytically or numerically from the equations.
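For the Lorenz case, the reference side of that comparison is available in closed form, so setting up the test is straightforward. A sketch of the ground truth a neural Jacobian would have to reproduce, using the standard parameters σ = 10, ρ = 28, β = 8/3 (an assumption; the paper's exact setup may differ):

```python
import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # standard Lorenz parameters (assumed)

def lorenz_rhs(s):
    x, y, z = s
    return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def lorenz_jacobian(s):
    """Closed-form Jacobian of the Lorenz right-hand side."""
    x, y, z = s
    return np.array([[-SIGMA, SIGMA, 0.0],
                     [RHO - z, -1.0, -x],
                     [y, x, -BETA]])

def fd_jacobian(f, s, eps=1e-6):
    """Central finite differences: the reference any data-driven Jacobian must match."""
    n = s.size
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = eps
        J[:, j] = (f(s + e) - f(s - e)) / (2 * eps)
    return J

s0 = np.array([1.0, 2.0, 20.0])  # arbitrary state near the attractor
err = np.max(np.abs(lorenz_jacobian(s0) - fd_jacobian(lorenz_rhs, s0)))
print(err)  # near roundoff: the Lorenz RHS is quadratic, so central differences are exact
```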

Figures

Figures reproduced from arXiv: 2604.19465 by Chengyun Wang, Liwei Chen, Nils Thuerey.

Figure 1
Figure 1. Schematic of our data-driven algorithm. (1) Time-resolved state snapshots collected from a dynamical system are used. (2) An NN emulator is trained using a rollout strategy to accurately learn the system’s evolution map. (3) The Jacobian of the trained emulator is then extracted via automatic differentiation and used to approximate the local Jacobian. This data-driven operator is the key ingredient for (4…
Figure 2
Figure 2. Time evolution of the true Lorenz system (red solid) compared with the NN emulator prediction (green dotted) from an unseen initial point (blue circle). (a) Trajectories in phase space; (b) Component-wise time evolution series.
Figure 3
Figure 3. Comparison of the real part of the Jacobians between the NN-based operator and the operator-based ground truth. The first and second rows correspond to the linear and nonlinear scenarios, respectively. We aim to demonstrate the applicability of the NN emulator approach to both linear and nonlinear regimes even in complex number space. For this purpose, we consider two datasets derived respectively from the…
Figure 4
Figure 4. Linear stability analysis results of the linear and nonlinear Ginzburg-Landau systems. (a)(b) Eigenspectra obtained from the NN-based Jacobian (◦), DMD-based Jacobian (+) and operator-based ground truth (•). The dashed line denotes the stability boundary in the complex plane. (c)(d) The leading three eigenmodes of the NN-based Jacobians in both scenarios, where solid and dashed lines show the real part an…
Figure 5
Figure 5. Resolvent analysis results of the complex Ginzburg–Landau systems. The first and the second row correspond to the linear and nonlinear scenarios, respectively. (a)(c) Resolvent gain distribution for the first three modes with respect to the frequency. (b)(d) The first three forcing and response modes at the peak gain frequency. The thick gray lines in the background show operator-based ground-truth for com…
Figure 6
Figure 6. Comparison of the eigenspectra obtained from the NN-based Jacobian (◦), DMD-based Jacobian (+) and operator-based ground truth (•) for the 2D channel flow system. The dashed line denotes the stability boundary in the complex plane. (a) Weakly nonlinear dataset. (b) Nonlinear dataset.
Figure 7
Figure 7. Comparison of the first-order A and P eigenmodes of the streamwise velocity field for the 2D channel flow system. The first column shows the reference results obtained from the OS operator. The second and third columns show the NN-based Jacobian results obtained from weakly nonlinear and nonlinear datasets.
Figure 8
Figure 8. Resolvent analysis results of the 2D channel flow system. (a) Resolvent gain curves of the analytical operator and the NN-based resolvent operators derived from weakly nonlinear and nonlinear datasets. (b) The first-order forcing and response modes of the streamwise velocity field at ω = 0.31.
Figure 9
Figure 9. Comparison of the (a) eigenspectra and (b) resolvent gain curves from the operator-based Jacobian and NN-based Jacobian for the 3D channel flow system.
Figure 10
Figure 10. Comparison of the first-order A and P eigenmodes of the streamwise velocity field for the 3D channel flow system.
Figure 11
Figure 11. Comparison of the first-order forcing and response modes of the streamwise velocity field at ω = 0.38 for the reduced 3D channel flow system.
Original abstract

Understanding how complex systems respond to perturbations, such as whether they will remain stable or what their most sensitive patterns are, is a fundamental challenge across science and engineering. Traditional stability and receptivity (resolvent) analyses are powerful but rely on known equations and linearization, limiting their use in nonlinear or poorly modeled systems. Here, we introduce a data-driven framework that automatically identifies stability properties and optimal forcing responses from observation data alone, without requiring governing equations. By training a neural network as a dynamics emulator and using automatic differentiation to extract its Jacobian, we can compute eigenmodes and resolvent modes directly from data. We demonstrate the method on both canonical chaotic models and high-dimensional fluid flows, successfully identifying dominant instability modes and input-output structures even in strongly nonlinear regimes. By leveraging a neural network-based emulator, we readily obtain a nonlinear representation of system dynamics while additionally retrieving intricate dynamical patterns that were previously difficult to resolve. This equation-free methodology establishes a broadly applicable tool for analyzing complex, high-dimensional datasets, with immediate relevance to grand challenges in fields such as climate science, neuroscience, and fluid engineering.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 2 minor

Summary. The manuscript introduces a data-driven framework for discovering stability and receptivity properties in physical systems. It trains a neural network to emulate the system's dynamics from observational data and employs automatic differentiation to extract the Jacobian matrix, enabling computation of eigenmodes for stability analysis and resolvent modes for receptivity analysis without requiring explicit governing equations. Demonstrations are provided on canonical chaotic models and high-dimensional fluid flows, showing identification of dominant modes even in nonlinear regimes.

Significance. If the extracted modes prove accurate, the method would offer a broadly applicable equation-free tool for stability and receptivity analysis in complex, high-dimensional systems where governing equations are unavailable, with relevance to fluid dynamics, climate science, and neuroscience. The combination of neural emulation for nonlinear dynamics with automatic differentiation for linear operators is a notable strength, as is the demonstration on both low-dimensional chaotic systems and fluid data.

major comments (3)
  1. Demonstrations section: The reported results on canonical models and fluid flows show only qualitative agreement with expected modes; no quantitative metrics are supplied, such as eigenvalue errors relative to analytical or linearized-operator references, eigenvalue perturbation under retraining, or comparison of resolvent gains to known values. This leaves the central claim that the NN Jacobian yields meaningful physical stability/receptivity information unverified.
  2. Method section (neural dynamics emulator and Jacobian extraction): State-prediction loss (e.g., trajectory MSE) is used to train the emulator, but no analysis, bounds, or regularization is provided to ensure that the automatically differentiated Jacobian approximates the true linearized vector field DF sufficiently closely; small state errors can produce large derivative discrepancies, especially in chaotic or high-dimensional regimes.
  3. Fluid-flow demonstrations: No direct validation is performed against the linearized Navier-Stokes operator or known resolvent spectra for the same base flows, so it is unclear whether the data-driven modes recover the physical input-output structures rather than artifacts of the emulator.
minor comments (2)
  1. The notation for the neural operator and its embedding of the dynamics emulator could be made more explicit with additional equations relating the network output to the discrete-time map.
  2. Figure captions and legends in the results section would benefit from clearer indication of which curves or fields correspond to the neural-operator modes versus reference solutions.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for the detailed and constructive review of our manuscript. We address each major comment below and indicate the revisions we will implement to improve the clarity and rigor of the work.

Point-by-point responses
  1. Referee: Demonstrations section: The reported results on canonical models and fluid flows show only qualitative agreement with expected modes; no quantitative metrics are supplied, such as eigenvalue errors relative to analytical or linearized-operator references, eigenvalue perturbation under retraining, or comparison of resolvent gains to known values. This leaves the central claim that the NN Jacobian yields meaningful physical stability/receptivity information unverified.

    Authors: We agree that quantitative validation would strengthen the central claims. In the revised manuscript we will add explicit metrics for the canonical models, including eigenvalue errors against analytical references and sensitivity of the extracted eigenvalues to retraining with different random initializations. For the fluid cases we will report comparisons of resolvent gains against literature values where such references exist. revision: yes

  2. Referee: Method section (neural dynamics emulator and Jacobian extraction): State-prediction loss (e.g., trajectory MSE) is used to train the emulator, but no analysis, bounds, or regularization is provided to ensure that the automatically differentiated Jacobian approximates the true linearized vector field DF sufficiently closely; small state errors can produce large derivative discrepancies, especially in chaotic or high-dimensional regimes.

    Authors: The referee correctly identifies a gap between the training objective and the accuracy of the extracted Jacobian. We will revise the methods section to include empirical checks that compare the automatically differentiated Jacobian against finite-difference approximations evaluated on the trained network. We will also add a brief discussion of the limitations that may arise in strongly chaotic regimes. Adding a Jacobian-regularization term during training is feasible and will be explored; however, deriving rigorous a-priori bounds on the derivative error without further assumptions on the data distribution remains outside the scope of the present framework. revision: partial

  3. Referee: Fluid-flow demonstrations: No direct validation is performed against the linearized Navier-Stokes operator or known resolvent spectra for the same base flows, so it is unclear whether the data-driven modes recover the physical input-output structures rather than artifacts of the emulator.

    Authors: The method is intended for equation-free settings where the linearized operator is unavailable by construction. Nevertheless, we will strengthen the fluid-flow section by comparing the extracted modes against well-documented resolvent and stability results from the literature for the specific base flows examined. Where computational resources permit, we will also generate reference resolvent spectra from the linearized Navier-Stokes equations for at least one canonical case to enable direct side-by-side validation. revision: yes
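The empirical check promised in response 2 is inexpensive to set up. The sketch below uses a tiny one-hidden-layer tanh emulator with random (untrained) weights as a stand-in for the trained network, and writes out the chain-rule Jacobian by hand, identical to what reverse-mode autodiff returns, so the comparison against central finite differences runs without any deep-learning dependency. All weights and shapes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny stand-in emulator: x_{k+1} = W2 @ tanh(W1 @ x + b1) + b2.
# Weights are random here; a real check would use the trained network.
W1, b1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((3, 8)), rng.standard_normal(3)

def emulator(x):
    return W2 @ np.tanh(W1 @ x + b1) + b2

def exact_jacobian(x):
    """Chain-rule Jacobian; identical to what reverse-mode autodiff returns."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ np.diag(1.0 - h**2) @ W1  # d tanh(u)/du = 1 - tanh(u)^2

def fd_jacobian(f, x, eps=1e-6):
    """Central finite differences, column by column."""
    cols = []
    for j in range(x.size):
        e = np.zeros(x.size); e[j] = eps
        cols.append((f(x + e) - f(x - e)) / (2 * eps))
    return np.stack(cols, axis=1)

x0 = rng.standard_normal(3)
err = np.max(np.abs(exact_jacobian(x0) - fd_jacobian(emulator, x0)))
print(err)  # small for a smooth network; large values would flag a derivative bug
```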

Circularity Check

0 steps flagged

No significant circularity in derivation chain

full rationale

The paper trains a neural network emulator on observed trajectory data to approximate system dynamics, then applies automatic differentiation to obtain the Jacobian of this emulator and computes its eigenmodes or resolvent modes via standard linear algebra. This chain does not reduce any claimed result to a fitted parameter by construction, nor does it rely on self-definitional mappings, self-citation load-bearing premises, or imported uniqueness theorems. The central output (data-driven modes) is obtained by post-processing the trained emulator rather than being statistically forced by the training loss itself. The method is self-contained against external benchmarks once the emulator is trained, with no evidence of the enumerated circularity patterns.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 1 invented entity

The central claim depends on the neural network learning an accurate dynamics emulator from data and the Jacobian of that emulator providing valid linear stability information even for nonlinear systems.

axioms (1)
  • domain assumption: A neural network can be trained to emulate the dynamics of a physical system from observation data alone.
    This underpins the entire pipeline for Jacobian extraction and mode computation.
invented entities (1)
  • Neural dynamics emulator (no independent evidence)
    purpose: to serve as a differentiable surrogate for system evolution without explicit governing equations
    Core component introduced to enable data-driven Jacobian-based analysis.

pith-pipeline@v0.9.0 · 5497 in / 1275 out tokens · 49924 ms · 2026-05-10T01:23:00.374147+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

77 extracted references · 3 canonical work pages · 2 internal anchors

  1. [1]

    Nonlinear oscillations, dynamical systems, and bifurcations of vector fields , volume 42

    John Guckenheimer and Philip Holmes. Nonlinear oscillations, dynamical systems, and bifurcations of vector fields , volume 42. Springer Science & Business Media, 2013

  2. [2]

    A theory of modal control

    Jerome D Simon and Sanjoy K Mitter. A theory of modal control. Information and control, 13(4):316–353, 1968

  3. [3]

    Turbulence, coherent structures, dynamical systems and symmetry

Philip Holmes. Turbulence, coherent structures, dynamical systems and symmetry. Cambridge University Press, 2012

  4. [4]

    Dynamic mode decomposition with control

    Joshua L Proctor, Steven L Brunton, and J Nathan Kutz. Dynamic mode decomposition with control. SIAM Journal on Applied Dynamical Systems , 15(1):142–161, 2016

  5. [5]

    Model reduction for flow analysis and control

    Clarence W Rowley and Scott TM Dawson. Model reduction for flow analysis and control. Annual Review of Fluid Mechanics , 49(1):387–417, 2017

  6. [6]

    Modal analysis of fluid flows: An overview

Kunihiko Taira, Steven L Brunton, Scott TM Dawson, Clarence W Rowley, Tim Colonius, Beverley J McKeon, Oliver T Schmidt, Stanislav Gordeyev, Vassilios Theofilis, and Lawrence S Ukeiley. Modal analysis of fluid flows: An overview. AIAA Journal, 55(12):4013–4041, 2017

  7. [7]

    Advances in global linear instability analysis of nonparallel and three-dimensional flows

    Vassilios Theofilis. Advances in global linear instability analysis of nonparallel and three-dimensional flows. Progress in aerospace sciences, 39(4):249–315, 2003

  8. [8]

    Global linear instability

Vassilios Theofilis. Global linear instability. Annual Review of Fluid Mechanics, 43(1):319–352, 2011

  9. [9]

Hydrodynamic stability without eigenvalues

Lloyd N Trefethen, Anne E Trefethen, Satish C Reddy, and Tobin A Driscoll. Hydrodynamic stability without eigenvalues. Science, 261(5121):578–584, 1993

  10. [10]

    A critical-layer framework for turbulent pipe flow

    Beverley J McKeon and Ati S Sharma. A critical-layer framework for turbulent pipe flow. Journal of Fluid Mechanics , 658:336–382, 2010

  11. [11]

    Dynamic mode decomposition of numerical and experimental data

    Peter J Schmid. Dynamic mode decomposition of numerical and experimental data. Journal of fluid mechanics , 656:5–28, 2010

  12. [12]

    Dynamic mode decomposition: data-driven modeling of complex systems

    J Nathan Kutz, Steven L Brunton, Bingni W Brunton, and Joshua L Proctor. Dynamic mode decomposition: data-driven modeling of complex systems . SIAM, 2016

  13. [13]

    The structure of inhomogeneous turbulent flows

    John Leask Lumley. The structure of inhomogeneous turbulent flows. Atmospheric turbulence and radio wave propagation , pages 166–178, 1967

  14. [14]

Discovering governing equations from data by sparse identification of nonlinear dynamical systems

Steven L Brunton, Joshua L Proctor, and J Nathan Kutz. Discovering governing equations from data by sparse identification of nonlinear dynamical systems. Proceedings of the National Academy of Sciences, 113(15):3932–3937, 2016

  15. [15]

    Nonmodal stability theory

    Peter J Schmid. Nonmodal stability theory. Annu. Rev. Fluid Mech. , 39(1):129–162, 2007

  16. [16]

    From bypass transition to flow control and data-driven turbulence modeling: an input–output viewpoint

Mihailo R Jovanović. From bypass transition to flow control and data-driven turbulence modeling: an input–output viewpoint. Annual Review of Fluid Mechanics, 53(1):311–345, 2021

  17. [17]

    An invitation to resolvent analysis

    Laura Victoria Rolandi, Jean Hélder Marques Ribeiro, Chi-An Yeh, and Kunihiko Taira. An invitation to resolvent analysis. Theoretical and Computational Fluid Dynamics , 38(5):603–639, 2024

  18. [18]

    Spectral analysis of nonlinear flows

Clarence W Rowley, Igor Mezić, Shervin Bagheri, Philipp Schlatter, and Dan S Henningson. Spectral analysis of nonlinear flows. Journal of Fluid Mechanics, 641:115–127, 2009

  19. [19]

    Data-driven resolvent analysis

Benjamin Herrmann, Peter J Baddoo, Richard Semaan, Steven L Brunton, and Beverley J McKeon. Data-driven resolvent analysis. Journal of Fluid Mechanics, 918:A10, 2021

  20. [20]

    Physics-informed dynamic mode decomposition

    Peter J Baddoo, Benjamin Herrmann, Beverley J McKeon, J Nathan Kutz, and Steven L Brunton. Physics-informed dynamic mode decomposition. Proceedings of the Royal Society A, 479(2271):20220576, 2023

  21. [21]

Kernel learning for robust dynamic mode decomposition: linear and nonlinear disambiguation optimization

Peter J Baddoo, Benjamin Herrmann, Beverley J McKeon, and Steven L Brunton. Kernel learning for robust dynamic mode decomposition: linear and nonlinear disambiguation optimization. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 478(2260), 2022

  22. [22]

Toward data-driven resolvent analysis of nonlinear flows

Carlos G. Hernández, Katherine Cao, Benjamin Herrmann, Steven L. Brunton, and Beverley J. McKeon. Toward data-driven resolvent analysis of nonlinear flows. CTR Annual Research Briefs, pages 33–42, 2023

  23. [23]

    Data-driven discovery of intrinsic dynamics

    Daniel Floryan and Michael D Graham. Data-driven discovery of intrinsic dynamics. Nature Machine Intelligence , 4(12):1113–1120, 2022

  24. [24]

    Learning dynamical systems from data: An introduction to physics-guided deep learning

    Rose Yu and Rui Wang. Learning dynamical systems from data: An introduction to physics-guided deep learning. Proceedings of the National Academy of Sciences , 121(27):e2311808121, 2024

  25. [25]

    Recurrent flow patterns as a basis for two-dimensional turbulence: Predicting statistics from structures

    Jacob Page, Peter Norgaard, Michael P Brenner, and Rich R Kerswell. Recurrent flow patterns as a basis for two-dimensional turbulence: Predicting statistics from structures. Proceedings of the National Academy of Sciences , 121(23):e2320007121, 2024

  26. [26]

    Deep dynamical modeling and control of unsteady fluid flows

Jeremy Morton, Antony Jameson, Mykel J Kochenderfer, and Freddie Witherden. Deep dynamical modeling and control of unsteady fluid flows. In Advances in Neural Information Processing Systems, 2018

  27. [27]

    Deep learning methods for reynolds-averaged navier–stokes simulations of airfoil flows

    Nils Thuerey, Konstantin Weißenow, Lukas Prantl, and Xiangyu Hu. Deep learning methods for reynolds-averaged navier–stokes simulations of airfoil flows. AIAA Journal, 58(1):25–36, 2020

  28. [28]

Towards high-accuracy deep learning inference of compressible flows over aerofoils

Li-Wei Chen and Nils Thuerey. Towards high-accuracy deep learning inference of compressible flows over aerofoils. Computers & Fluids, 250:105707, 2023

  29. [29]

Learning data-driven discretizations for partial differential equations

Yohai Bar-Sinai, Stephan Hoyer, Jason Hickey, and Michael P Brenner. Learning data-driven discretizations for partial differential equations. Proceedings of the National Academy of Sciences, 116(31):15344–15349, 2019

  30. [30]

    Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations

    Maziar Raissi, Paris Perdikaris, and George E Karniadakis. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational physics , 378:686–707, 2019

  31. [31]

    Machine learning–accelerated computational fluid dynamics

Dmitrii Kochkov, Jamie A Smith, Ayya Alieva, Qing Wang, Michael P Brenner, and Stephan Hoyer. Machine learning–accelerated computational fluid dynamics. Proceedings of the National Academy of Sciences, 118(21):e2101784118, 2021

  32. [32]

Learned turbulence modelling with differentiable fluid solvers: physics-based loss functions and optimisation horizons

Björn List, Li-Wei Chen, and Nils Thuerey. Learned turbulence modelling with differentiable fluid solvers: physics-based loss functions and optimisation horizons. Journal of Fluid Mechanics, 949:A25, 2022

  33. [33]

    Deep learning-based predictive modeling of transonic flow over an airfoil

    Liwei Chen and Nils Thuerey. Deep learning-based predictive modeling of transonic flow over an airfoil. Physics of Fluids , 36(12), 2024

  34. [34]

    Apebench: A benchmark for autoregressive neural emulators of pdes

    Felix Koehler, Simon Niedermayr, Nils Thuerey, et al. Apebench: A benchmark for autoregressive neural emulators of pdes. Advances in Neural Information Processing Systems, 37:120252–120310, 2024

  35. [35]

    Fourier Neural Operator for Parametric Partial Differential Equations

Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, and Anima Anandkumar. Fourier neural operator for parametric partial differential equations. arXiv preprint arXiv:2010.08895, 2020

  36. [36]

    Towards stability of autoregressive neural operators

Michael McCabe, Peter Harrington, Shashank Subramanian, and Jed Brown. Towards stability of autoregressive neural operators. Transactions on Machine Learning Research, 2023

  37. [37]

    On the mechanism of trailing vortex wandering

Adam M Edstrand, Timothy B Davis, Peter J Schmid, Kunihiko Taira, and Louis N Cattafesta III. On the mechanism of trailing vortex wandering. Journal of Fluid Mechanics, 801:R1, 2016

  38. [38]

    Three-dimensional floquet stability analysis of the wake of a circular cylinder

    Dwight Barkley and Ronald D Henderson. Three-dimensional floquet stability analysis of the wake of a circular cylinder. Journal of Fluid Mechanics , 322:215–241, 1996

  39. [39]

Factorized Fourier neural operators

    Alasdair Tran, Alexander Mathews, Lexing Xie, and Cheng Soon Ong. Factorized fourier neural operators. arXiv preprint arXiv:2111.13802 , 2021

  40. [40]

Solver-in-the-loop: Learning from differentiable physics to interact with iterative pde-solvers

Kiwon Um, Robert Brand, Yun Raymond Fei, Philipp Holl, and Nils Thuerey. Solver-in-the-loop: Learning from differentiable physics to interact with iterative pde-solvers. Advances in Neural Information Processing Systems, 33:6111–6122, 2020

  41. [41]

    Energy growth in viscous channel flows

    Satish C Reddy and Dan S Henningson. Energy growth in viscous channel flows. Journal of Fluid Mechanics , 252:209–238, 1993

  42. [42]

Stability and transition in shear flows

Peter J Schmid and Dan S Henningson. Stability and transition in shear flows. Springer Science & Business Media, 2012

  44. [44]

    Turbulence and the dynamics of coherent structures

Lawrence Sirovich. Turbulence and the dynamics of coherent structures. Part I: Coherent structures. Quarterly of Applied Mathematics, 45(3):561–571, 1987

  45. [45]

Deep reinforcement learning in a handful of trials using probabilistic dynamics models

Kurtland Chua, Roberto Calandra, Rowan McAllister, and Sergey Levine. Deep reinforcement learning in a handful of trials using probabilistic dynamics models. Advances in Neural Information Processing Systems, 31, 2018

  46. [46]

Inverse design for fluid-structure interactions using graph network simulators

Kelsey Allen, Tatiana Lopez-Guevara, Kimberly L Stachenfeld, Alvaro Sanchez Gonzalez, Peter Battaglia, Jessica B Hamrick, and Tobias Pfaff. Inverse design for fluid-structure interactions using graph network simulators. Advances in Neural Information Processing Systems, 35:13759–13774, 2022

  47. [47]

    Data-driven science and engineering: Machine learning, dynamical systems, and control

    Steven L Brunton and J Nathan Kutz. Data-driven science and engineering: Machine learning, dynamical systems, and control . Cambridge University Press, 2022

  48. [48]

    Definition and properties of lagrangian coherent structures from finite-time lyapunov exponents in two-dimensional aperiodic flows

    Shawn C Shadden, Francois Lekien, and Jerrold E Marsden. Definition and properties of lagrangian coherent structures from finite-time lyapunov exponents in two-dimensional aperiodic flows. Physica D: Nonlinear Phenomena , 212(3-4):271–304, 2005

  49. [49]

    Input-output analysis and control design applied to a linear model of spatially developing flows

    S Bagheri, DS Henningson, J Hœpffner, and Peter J Schmid. Input-output analysis and control design applied to a linear model of spatially developing flows. Applied Mechanics Reviews, 62(2):020803, 2009

  50. [50]

    H2 optimal actuator and sensor placement in the linearised complex ginzburg–landau system

Kevin K Chen and Clarence W Rowley. H2 optimal actuator and sensor placement in the linearised complex ginzburg–landau system. Journal of Fluid Mechanics, 681:241–260, 2011

  51. [51]

    Efficient computation of global resolvent modes

    Eduardo Martini, Daniel Rodríguez, Aaron Towne, and André VG Cavalieri. Efficient computation of global resolvent modes. Journal of Fluid Mechanics , 919:A3, 2021

  52. [52]

Non-normality and classification of amplification mechanisms in stability and resolvent analysis

Sean Symon, Kevin Rosenberg, Scott TM Dawson, and Beverley J McKeon. Non-normality and classification of amplification mechanisms in stability and resolvent analysis. Physical Review Fluids, 3(5):053902, 2018

  53. [53]

    Shenfun - automating the spectral galerkin method

    Mikael Mortensen. Shenfun - automating the spectral galerkin method. In Bjorn Helge Skallerud and Helge Ingolf Andersson, editors, MekIT’17 - Ninth national conference on Computational Mechanics, pages 273–298. International Center for Numerical Methods in Engineering (CIMNE), 2017

  54. [54]

    Shenfun: High performance spectral galerkin computing platform

    Mikael Mortensen. Shenfun: High performance spectral galerkin computing platform. Journal of Open Source Software , 3(31):1071, 2018

  55. [55]

    U-net: Convolutional networks for biomedical image segmentation

    Olaf Ronneberger, Philipp Fischer, and Thomas Brox. U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical image com- puting and computer-assisted intervention , pages 234–241. Springer, 2015

  56. Leslie M Mack. A numerical study of the temporal eigenvalue spectrum of the Blasius boundary layer. Journal of Fluid Mechanics, 73(3):497–520, 1976.

  57. Mihailo R Jovanović and Bassam Bamieh. Componentwise energy amplification in channel flows. Journal of Fluid Mechanics, 534:145–183, 2005.

  58. Mihailo R Jovanović, Peter J Schmid, and Joseph W Nichols. Sparsity-promoting dynamic mode decomposition. Physics of Fluids, 26(2), 2014.

  59. Florimond Guéniat, Lionel Mathelin, and Luc R Pastur. A dynamic mode decomposition approach for large and arbitrarily sampled systems. Physics of Fluids, 27(2), 2015.

  60. Yun Zhao, David B Grayden, Mario Boley, Yueyang Liu, Philippa J Karoly, Mark J Cook, and Levin Kuhlmann. Cortical stability and chaos during focal seizures: insights from inference-based modeling. Journal of Neural Engineering, 22(3):036021, 2025.

  61. E Van der Avoird, H Dijkstra, J Nauw, and C Schuurmans. Nonlinearly induced low-frequency variability in a midlatitude coupled ocean–atmosphere model of intermediate complexity. Climate Dynamics, 19(3):303–320, 2002.

  62. Hao Yuan, Jiaqing Kou, Chuanqiang Gao, and Weiwei Zhang. Resolvent and dynamic mode analysis of flow past a square cylinder at subcritical Reynolds numbers. Physics of Fluids, 35(7), 2023.

  63. Boa-Teh Chu. On the energy transfer to small disturbances in fluid flow (part I). Acta Mechanica, 1(3):215–234, 1965.

  64. Uwe Ehrenstein and Francois Gallaire. On two-dimensional temporal modes in spatially evolving open flows: the flat-plate boundary layer. Journal of Fluid Mechanics, 536:209–218, 2005.

  65. C-M Ho and Patrick Huerre. Perturbed free shear layers. Annual Review of Fluid Mechanics, 16:365–424, 1984.

  66. Lutz Lesshafft, Patrick Huerre, Pierre Sagaut, and Marc Terracol. Nonlinear global modes in hot jets. Journal of Fluid Mechanics, 554:393–409, 2006.

  67. Eric Lauga and Thomas R Bewley. Performance of a linear robust control strategy on a nonlinear model of spatially developing flows. Journal of Fluid Mechanics, 512:343–374, 2004.

  68. Kelly Cohen, Stefan Siegel, Thomas McLaughlin, Eric Gillies, and James Myatt. Closed-loop approaches to control of a wake flow modeled by the Ginzburg–Landau equation. Computers & Fluids, 34(8):927–949, 2005.

  69. J Andre Weideman and Satish C Reddy. A MATLAB differentiation matrix suite. ACM Transactions on Mathematical Software (TOMS), 26(4):465–519, 2000.

  70. Walter Edwin Arnoldi. The principle of minimized iterations in the solution of the matrix eigenvalue problem. Quarterly of Applied Mathematics, 9(1):17–29, 1951.

  71. Yousef Saad. Variations on Arnoldi's method for computing eigenelements of large unsymmetric matrices. Linear Algebra and its Applications, 34:269–295, 1980.

  72. Paul Fischer, James Lottes, and Henry Tufo. Nek5000, May 2007.

  73. J.-Ch. Loiseau, J.-Ch. Robinet, S. Cherubini, and E. Leriche. Investigation of the roughness-induced transition: global stability analyses and direct numerical simulations. Journal of Fluid Mechanics, 760:175–211, 2014.

  74. Dwight Barkley. Linear analysis of the cylinder wake mean flow. Europhysics Letters, 75(5):750, 2006.

  75. Flavio Giannetti and Paolo Luchini. Structural sensitivity of the first instability of the cylinder wake. Journal of Fluid Mechanics, 581:167–197, 2007.

  76. Olivier Marquet, Denis Sipp, and Laurent Jacquin. Sensitivity analysis and passive control of cylinder flow. Journal of Fluid Mechanics, 615:221–252, 2008.

  77. Dan Hendrycks and Kevin Gimpel. Gaussian error linear units (GELUs). arXiv preprint arXiv:1606.08415, 2016.