Physics-Informed Transformer operator for the prediction of three-dimensional turbulence
Data-driven turbulence prediction methods often suffer from data dependency and a lack of physical interpretability. In this paper, we propose a physics-informed Transformer operator (PITO) and its implicit variant (PIITO) for predicting three-dimensional (3D) turbulence, both built on the vision Transformer (ViT) architecture with an appropriate patch size. Given the current flow field, the Transformer operator predicts the flow field at the next time step. By embedding the large-eddy simulation (LES) equations into the loss function, PITO and PIITO learn solution operators without labeled data. Furthermore, PITO can automatically learn the subgrid-scale (SGS) coefficient from a single set of flow data during training. Both PITO and PIITO exhibit excellent stability and accuracy in predicting various statistical properties and flow structures in long-term extrapolation exceeding 25 times the training horizon in decaying homogeneous isotropic turbulence (HIT), and outperform the physics-informed Fourier neural operator (PIFNO). In addition, PITO remains remarkably accurate in forced HIT, where PIFNO fails. Notably, compared to PIFNO, PITO and PIITO reduce GPU memory consumption by 79.5% and 91.3% while requiring only 31.5% and 3.1% of the parameters, respectively. Moreover, both PITO and PIITO are much faster than the traditional LES method.
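The abstract describes two ingredients that can be sketched concretely: splitting a 3D flow snapshot into ViT-style patch tokens, and replacing a labeled-data loss with a physics residual of the predicted field. The NumPy sketch below is illustrative only: the patch size, the identity round-trip, and the simple continuity (divergence-free) residual are hypothetical stand-ins for the paper's actual Transformer layers and LES-equation loss.

```python
import numpy as np

def patchify(field, p):
    """Split a 3D snapshot (C, D, H, W) into non-overlapping p*p*p patch tokens."""
    C, D, H, W = field.shape
    return (field
            .reshape(C, D // p, p, H // p, p, W // p, p)
            .transpose(1, 3, 5, 0, 2, 4, 6)   # group patches, then channels
            .reshape(-1, C * p ** 3))         # (num_patches, token_dim)

def unpatchify(tokens, C, D, H, W, p):
    """Inverse of patchify: reassemble tokens into a (C, D, H, W) field."""
    return (tokens
            .reshape(D // p, H // p, W // p, C, p, p, p)
            .transpose(3, 0, 4, 1, 5, 2, 6)
            .reshape(C, D, H, W))

def divergence(u, dx=1.0):
    """Central-difference divergence of a velocity field u: (3, D, H, W), periodic BCs."""
    div = np.zeros(u.shape[1:])
    for axis in range(3):
        div += (np.roll(u[axis], -1, axis) - np.roll(u[axis], 1, axis)) / (2 * dx)
    return div

def physics_loss(u_next):
    """Data-free loss: penalize the continuity residual of the predicted field.
    (A stand-in for the LES-equation residual used in the paper.)"""
    return np.mean(divergence(u_next) ** 2)

# Toy round trip on a 16^3 field with patch size 4 (4^3 = 64 patches of dim 3*64).
rng = np.random.default_rng(0)
u = rng.standard_normal((3, 16, 16, 16))
tokens = patchify(u, p=4)
assert tokens.shape == (64, 192)
assert np.allclose(unpatchify(tokens, 3, 16, 16, 16, 4), u)
```

In a full model the tokens would pass through Transformer blocks before being unpatchified into the next-step field, and `physics_loss` on that prediction would drive training with no reference data.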
Forward citations
Cited by 1 Pith paper
- Large-eddy simulation nets (LESnets) based on physics-informed neural operator for wall-bounded turbulence: LESnets integrates LES equations and the law of the wall into F-FNO to enable data-free, stable long-term predictions of wall-bounded turbulence at Re_tau up to 1000 on coarse grids, matching traditional LES accuracy ...