pith. machine review for the scientific record.
lemma · proved · term proof · high

Jlog_nonneg


Jlog is nonnegative for every real argument. Cost analysts in the Recognition framework cite this when establishing global minima for the log-domain cost and when proving nonnegativity of ledger posting discrepancies. The proof is a one-line wrapper that applies Jcost_nonneg after substituting the strictly positive exponential.

claim

For all real $t$, $0 ≤ Jlog(t)$, where $Jlog(t) := Jcost(\exp t)$ and $Jcost(x) ≥ 0$ for all $x > 0$ by the AM-GM inequality.
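The AM-GM step behind the claim can be spelled out directly from the explicit form of the cost, $J(x) = (x + x^{-1})/2 - 1$:

$$Jlog(t) = \frac{e^t + e^{-t}}{2} - 1 \;\ge\; \sqrt{e^t \cdot e^{-t}} - 1 = \sqrt{1} - 1 = 0,$$

applying AM-GM, $\tfrac{a+b}{2} \ge \sqrt{ab}$ for $a, b > 0$, with $a = e^t$ and $b = e^{-t}$. Equality holds exactly when $e^t = e^{-t}$, i.e. at $t = 0$, which is where the global minimum is attained.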

background

The Cost module introduces Jcost as the base cost satisfying the Recognition Composition Law, with the explicit form $J(x) = (x + x^{-1})/2 - 1$ for $x > 0$. Jlog is the composition $Jcost ∘ exp$, shifting the cost into additive logarithmic coordinates. This construction appears throughout the module and its FlogEL sibling.
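For orientation, a minimal Lean sketch of this setup, assuming definitions that mirror the ones described above (the actual Cost module may declare them with different namespaces or attributes):

```lean
import Mathlib.Analysis.SpecialFunctions.Exp

-- Base cost J(x) = (x + x⁻¹)/2 − 1, meaningful for x > 0.
noncomputable def Jcost (x : ℝ) : ℝ := (x + x⁻¹) / 2 - 1

-- Log-domain cost: compose the base cost with the exponential,
-- moving the multiplicative argument into additive coordinates.
noncomputable def Jlog (t : ℝ) : ℝ := Jcost (Real.exp t)
```

Under these definitions, `Jlog t` inherits nonnegativity from `Jcost` because `Real.exp t` is always strictly positive.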

proof idea

The proof is a one-line wrapper that applies the lemma Jcost_nonneg to the fact that Real.exp t is positive.

why it matters in Recognition Science

This lemma is invoked by EL_global_min to establish that Jlog attains its global minimum at zero, and by ledgerJlogCost_nonneg to bound total costs in ledger transitions. It supplies the basic nonnegativity step required by the J-uniqueness property at T5 of the forcing chain and supports downstream extraction of constants such as alpha inverse inside (137.030, 137.039).

scope and limits

formal statement (Lean)

lemma Jlog_nonneg (t : ℝ) : 0 ≤ Jlog t :=

proof body

Term-mode proof.

  Jcost_nonneg (Real.exp_pos t)

used by (6)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (12)

Lean names referenced from this declaration's body.