Jlog_nonneg
Jlog is nonnegative for every real argument. Cost analysts in the Recognition framework cite this when establishing global minima for the log-domain cost and when proving nonnegativity of ledger posting discrepancies. The proof is a one-line wrapper that applies Jcost_nonneg after substituting the strictly positive exponential.
claim
For all real $t$, $0 ≤ Jlog(t)$, where $Jlog(t) := Jcost(\exp t)$ and $Jcost(x) ≥ 0$ for all $x > 0$ by the AM-GM inequality.
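For reference, the AM-GM step cited in the claim can be written out directly from the explicit form of Jcost: for $x > 0$, the arithmetic mean of $x$ and $x^{-1}$ dominates their geometric mean, so

$$Jcost(x) = \frac{x + x^{-1}}{2} - 1 \;\ge\; \sqrt{x \cdot x^{-1}} - 1 = 1 - 1 = 0.$$

Substituting $x = \exp t > 0$ gives $Jlog(t) = Jcost(\exp t) ≥ 0$ for every real $t$.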
background
The Cost module introduces Jcost as the base cost satisfying the Recognition Composition Law, with the explicit form $J(x) = (x + x^{-1})/2 - 1$ for $x > 0$. Jlog is the composition $Jcost ∘ exp$, shifting the cost into additive logarithmic coordinates. This construction appears throughout the module and its FlogEL sibling.
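The two definitions described above can be sketched in Lean as follows; this is a minimal reconstruction from the explicit form stated in the text, not the Cost module's actual source.

```lean
import Mathlib

/-- Base cost with the explicit form stated in the Cost module
    (reconstructed here; the real module may define it differently). -/
noncomputable def Jcost (x : ℝ) : ℝ := (x + x⁻¹) / 2 - 1

/-- Log-domain cost: `Jcost` composed with `Real.exp`, moving the
    multiplicative argument into additive logarithmic coordinates. -/
noncomputable def Jlog (t : ℝ) : ℝ := Jcost (Real.exp t)
```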
proof idea
The proof is a one-line term-mode wrapper: `Real.exp_pos t` supplies the fact that `Real.exp t` is strictly positive, so `Jcost_nonneg` applies directly, and `0 ≤ Jcost (Real.exp t)` is definitionally `0 ≤ Jlog t`.
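A self-contained sketch of the whole chain, repeating the definitions so the snippet compiles on its own (the `nlinarith` proof of `Jcost_nonneg` is an assumed reconstruction; the Cost module's proof may differ):

```lean
import Mathlib

noncomputable def Jcost (x : ℝ) : ℝ := (x + x⁻¹) / 2 - 1
noncomputable def Jlog (t : ℝ) : ℝ := Jcost (Real.exp t)

/-- AM-GM step: the base cost is nonnegative on positive reals. -/
theorem Jcost_nonneg {x : ℝ} (hx : 0 < x) : 0 ≤ Jcost x := by
  unfold Jcost
  nlinarith [sq_nonneg (x - 1), mul_inv_cancel₀ (ne_of_gt hx), inv_pos.mpr hx]

/-- The one-line wrapper: `exp t` is positive, so the base lemma applies,
    and `Jcost (Real.exp t)` unfolds definitionally to `Jlog t`. -/
theorem Jlog_nonneg (t : ℝ) : 0 ≤ Jlog t :=
  Jcost_nonneg (Real.exp_pos t)
```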
why it matters in Recognition Science
This lemma is invoked by EL_global_min to establish that Jlog attains its global minimum at zero, and by ledgerJlogCost_nonneg to bound total costs in ledger transitions. It supplies the basic nonnegativity step required by the J-uniqueness property at T5 of the forcing chain and supports downstream extraction of constants such as alpha inverse inside (137.030, 137.039).
scope and limits
- Does not claim nonnegativity for non-real arguments.
- Does not prove strict positivity, which holds everywhere except at $t = 0$ and is a separate, stronger statement.
- Does not derive the explicit cosh form of Jlog.
- Does not extend to vector-valued or multidimensional costs.
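Although the lemma does not derive it, the cosh form mentioned above follows by a one-line computation from the stated definitions, since $(\exp t)^{-1} = \exp(-t)$:

$$Jlog(t) = Jcost(\exp t) = \frac{e^{t} + e^{-t}}{2} - 1 = \cosh t - 1,$$

which makes the global minimum $Jlog(0) = 0$ visually immediate.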
formal statement (Lean)
lemma Jlog_nonneg (t : ℝ) : 0 ≤ Jlog t :=
proof body
Term-mode proof.
  Jcost_nonneg (Real.exp_pos t)