pith. machine review for the scientific record.
theorem · proved · tactic proof · high

subK_propagation


Sub-Kolmogorov bounds on velocity gradients propagate forward under the discrete Navier-Stokes evolution on the RS lattice. Discrete regularity analysts cite the result to close the self-improvement loop and remove external hypotheses. The argument is a direct induction on the time index that applies order transitivity at each successor step.

claim

Let $(g_n)_{n=0}^∞$ be a sequence of real numbers and let $B ∈ ℝ$. If $g_0 ≤ B$ and $g_{n+1} ≤ g_n$ for every natural number $n$, then $g_n ≤ B$ for all $n$.
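As a minimal sanity check (our own example, not from the module), the constant sequence $g_n = B$ satisfies both hypotheses, so the claim applies to it trivially via the formalized theorem `subK_propagation`:

```lean
-- Hypothetical sanity check: the constant sequence gₙ = B meets both
-- hypotheses, so the claim yields gₙ ≤ B for every n.
example (B : ℝ) : ∀ n : ℕ, (fun _ : ℕ => B) n ≤ B :=
  subK_propagation (fun _ => B) B le_rfl (fun _ => le_rfl)
```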

background

The module proves that the sub-Kolmogorov condition is self-improving under discrete Navier-Stokes evolution on the RS lattice, making lattice regularity unconditional. The sub-Kolmogorov condition is the requirement that the velocity gradient stays below the viscous scale $ν/h²$ at every time step. The local setting is a finite lattice with finite-energy initial data, where the discrete maximum principle ensures viscous contraction dominates advection when the bound holds.
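In symbols, the sub-Kolmogorov condition at time step $n$ can be rendered as follows (the discrete gradient notation $∇_h$ and velocity field $u_n$ are our shorthand, not names taken from the module):

```latex
\max_{x} \, \bigl|\nabla_h u_n(x)\bigr| \;\le\; \frac{\nu}{h^2}
```

Here $ν$ is the viscosity and $h$ the lattice spacing, so $ν/h^2$ is the viscous-scale threshold quoted in the background above.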

proof idea

The proof is a direct induction on the time index n. The base case at zero applies the initial bound hypothesis. The successor case invokes the step hypothesis to obtain the bound at k+1 from the bound at k, then combines it with the inductive hypothesis via order transitivity.
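The same induction can also be written in term mode via the natural-number recursor; the following is a sketch (the primed name is ours, not part of the module):

```lean
-- Term-mode rendering of the induction: Nat.rec packages the base case
-- (h_init) and the transitivity step (le_trans) explicitly.
theorem subK_propagation' (gradients : ℕ → ℝ) (bound : ℝ)
    (h_init : gradients 0 ≤ bound)
    (h_step : ∀ n, gradients (n + 1) ≤ gradients n) :
    ∀ n, gradients n ≤ bound :=
  fun n => Nat.rec (motive := fun m => gradients m ≤ bound)
    h_init (fun k ih => le_trans (h_step k) ih) n
```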

why it matters in Recognition Science

This supplies the propagation step for unconditional_gradient_bound, which in turn feeds master_lattice_regularity. By eliminating the external sub-Kolmogorov hypothesis, it renders the discrete Navier-Stokes regularity result fully unconditional on the finite RS lattice. The module cites Proposition III.1 of Thapa & Washburn on strong convexity and spectral gap for the torus.

scope and limits

Lean usage

theorem unconditional_gradient_bound (gradients : ℕ → ℝ) (nu h : ℝ)
    (hnu : 0 < nu) (hh : 0 < h)
    (h_finite_init : gradients 0 ≤ nu / h ^ 2)
    (h_nonincreasing : ∀ n, gradients (n + 1) ≤ gradients n) :
    ∀ n, gradients n ≤ nu / h ^ 2 :=
  subK_propagation gradients (nu / h ^ 2) h_finite_init h_nonincreasing
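A hedged instantiation (our example, not from the module): a constant-zero gradient sequence on a unit lattice, ν = 1 and h = 1, satisfies every hypothesis of `unconditional_gradient_bound`:

```lean
-- Sketch only: zero gradients on a unit lattice (ν = 1, h = 1).
example : ∀ n : ℕ, (fun _ : ℕ => (0 : ℝ)) n ≤ (1 : ℝ) / 1 ^ 2 :=
  unconditional_gradient_bound (fun _ => 0) 1 1 one_pos one_pos
    (by norm_num) (fun _ => le_rfl)
```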

formal statement (Lean)

theorem subK_propagation
    (gradients : ℕ → ℝ) (bound : ℝ)
    (h_init : gradients 0 ≤ bound)
    (h_step : ∀ n, gradients (n + 1) ≤ gradients n) :
    ∀ n, gradients n ≤ bound := by

proof body

Tactic-mode proof by induction on the time index.

  intro n
  induction n with
  | zero => exact h_init
  | succ k ih => exact le_trans (h_step k) ih

/-- The lattice regularity theorem: on the RS lattice with finite-energy
initial data, the velocity gradient is bounded for all time.

This is unconditional: the sub-Kolmogorov condition at t = 0 follows from
finiteness of the initial data on the finite lattice (max |grad u_0| < infty
on any finite lattice), and the self-improving property propagates it. -/

used by (2)

From the project-wide theorem graph. These declarations reference this one in their bodies.

depends on (23)

Lean names referenced from this declaration's body.