pith. machine review for the scientific record.

Variance


The Variance definition computes the total sum of squared deviations from the mean for any real-valued function on the Boolean hypercube of dimension n. Researchers analyzing convergence rates of gradient descent on J-cost landscapes cite it to establish non-negativity and constancy properties inside spectral gap certificates. It is realized as an explicit finite sum over all 2^n assignments with no further lemmas required.

claim

For any function $x : (Fin n → Bool) → ℝ$, define $\mathrm{Variance}(x) := \sum_{a} \bigl(x(a) - \tfrac{1}{2^n} \sum_{b} x(b)\bigr)^2$, where the sums run over the hypercube $\{0,1\}^n$.
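As a worked instance (illustrative only, not drawn from the module): take $n = 1$ with $x(0) = 0$ and $x(1) = 1$. The mean is $\tfrac{1}{2}(0 + 1) = \tfrac{1}{2}$, so $\mathrm{Variance}(x) = (0 - \tfrac{1}{2})^2 + (1 - \tfrac{1}{2})^2 = \tfrac{1}{2} > 0$, consistent with non-constant functions having positive spread.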

background

The SpectralGap module studies the second eigenvalue λ₂ of the J-cost Laplacian, which governs the rate at which recognition dynamics (modeled as gradient descent on the J-cost landscape) converge to a minimum. Variance supplies the non-negative spread measure used to certify that non-constant functions on assignments have positive gap contribution. The module lists Variance together with empty_formula_flat_landscape and ConvergenceRate as core ingredients for iteration bounds that scale with the number of clauses divided by λ₂.

proof idea

This is a direct definition that expands to the sum of squared deviations from the mean value of x. No lemmas are invoked; the expression is the explicit construction later unfolded by variance_nonneg (via Finset.sum_nonneg and sq_nonneg) and variance_const_zero (via sub_self and sq).
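The unfolding by variance_nonneg can be sketched as follows. This is a hedged reconstruction from the lemma names above, not the repository's actual proof; it assumes `Variance` unfolds definitionally so the term-mode proof typechecks.

```lean
-- Sketch: every summand is a square, hence non-negative,
-- so the whole finite sum is non-negative.
example {n : ℕ} (x : (Fin n → Bool) → ℝ) : 0 ≤ Variance x :=
  Finset.sum_nonneg (fun a _ => sq_nonneg _)
```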

why it matters in Recognition Science

Variance is the concrete non-negative quantity appearing in the variance_nonneg_cert field of SpectralGapCert, which in turn extracts a positive min_sensitivity from any UNSATGapCondition. It therefore supports the claim that unsatisfiable formulas possess a positive spectral gap, feeding the convergence-rate analysis that links the J-cost Laplacian to the Recognition framework's control of mixing times. The definition closes the scaffolding path toward the Cheeger-type inequality still marked with a sorry in unsat_has_spectral_gap.

scope and limits

Lean usage

theorem use_variance {n : ℕ} (x : (Fin n → Bool) → ℝ) : 0 ≤ Variance x := variance_nonneg x

formal statement (Lean)

def Variance {n : ℕ} (x : (Fin n → Bool) → ℝ) : ℝ :=

proof body

Definition body.

  Finset.univ.sum (fun a : Fin n → Bool =>
    (x a - Finset.univ.sum (fun b : Fin n → Bool => x b) /
           (Finset.univ.card (α := Fin n → Bool) : ℝ))^2)
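For the constant case handled by variance_const_zero, one plausible unfolding of this body is sketched below. This is an illustrative reconstruction following the stated proof idea (the mean of a constant $c$ is $c$, so each deviation vanishes via sub_self); lemma names such as `mul_div_cancel_left₀` have shifted across Mathlib versions and are not checked against this repository.

```lean
-- Sketch: the mean of the constant c over all 2^n assignments is c,
-- so each squared deviation is (c - c)^2 = 0 and the sum is 0.
example {n : ℕ} (c : ℝ) :
    Variance (fun _ : Fin n → Bool => c) = 0 := by
  have h : (Finset.univ.card (α := Fin n → Bool) : ℝ) ≠ 0 := by
    exact_mod_cast Finset.card_ne_zero.mpr Finset.univ_nonempty
  simp [Variance, Finset.sum_const, nsmul_eq_mul,
        mul_div_cancel_left₀, h, sub_self]
```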

used by (4)

From the project-wide theorem graph. These declarations reference this one in their body.