infoCost
infoCost equips each recognition event with its J-cost, a measure of the information content of the event's ratio. Researchers in information geometry and ledger-based physics cite it to ground statements about entropy and minimum cost. The implementation is a direct one-line delegation to the underlying Jcost function applied to the event's ratio.
claim
For a recognition event $e$ with positive ratio $x$, the information cost is $J(x)$, where $J(x) = (x + x^{-1})/2 - 1$.
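As a quick sanity check (plain arithmetic on the stated formula, not taken from the module):

$$J(1) = \frac{1+1}{2} - 1 = 0, \qquad J(2) = \frac{2 + \tfrac{1}{2}}{2} - 1 = \tfrac{1}{4} = J(\tfrac{1}{2}),$$

illustrating that the cost vanishes at the balanced ratio and is symmetric under $x \mapsto 1/x$.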
background
RecognitionEvent is the structure consisting of a positive real ratio together with a positivity witness. Jcost is the cost function on positive reals induced by the multiplicative recognizer; it satisfies the Recognition Composition Law $J(xy) + J(x/y) = 2J(x)J(y) + 2J(x) + 2J(y)$. This definition appears in the module InformationIsLedger, whose documentation states that information is the physical ledger and that every recognition ratio carries a definite J-cost of at least zero.
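The law follows directly from the closed form. Writing $J(x) + 1 = (x + x^{-1})/2$, a short check using only the formula in the claim gives

$$J(xy) + J(x/y) = \frac{xy + \tfrac{1}{xy} + \tfrac{x}{y} + \tfrac{y}{x}}{2} - 2 = \frac{(x + x^{-1})(y + y^{-1})}{2} - 2 = 2\bigl(J(x)+1\bigr)\bigl(J(y)+1\bigr) - 2,$$

and expanding the right-hand side yields $2J(x)J(y) + 2J(x) + 2J(y)$.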
proof idea
A one-line wrapper that applies Jcost to the event's ratio; the definition carries no proof obligations of its own.
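For orientation, here is a minimal sketch of the shapes involved, reconstructed from the description above. The field name ratio appears in the formal statement below; ratio_pos and the closed form of Jcost are assumptions for illustration, not the module's actual source.

import Mathlib

-- Hypothetical reconstruction; the real definitions live in InformationIsLedger.
structure RecognitionEvent where
  ratio : ℝ              -- the recognition ratio
  ratio_pos : 0 < ratio  -- positivity witness

noncomputable def Jcost (x : ℝ) : ℝ := (x + x⁻¹) / 2 - 1

-- The wrapper: infoCost simply evaluates Jcost at the event's ratio.
noncomputable def infoCost (e : RecognitionEvent) : ℝ := Jcost e.ratio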
why it matters in Recognition Science
This definition is used by balanced_has_minimum_cost, info_cost_nonneg, info_cost_symmetric, and totalInfoCost. It fills the first core item in the IC-001 list: every recognition ratio has a definite J-cost. It connects to J-uniqueness (T5) in the forcing chain and supports the claim that the ledger is reality, with $J(x) = 0$ only at the balanced state $x = 1$.
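The balanced-state claim is visible in the closed form itself (again plain algebra, not the module's proof): for $x > 0$,

$$J(x) = \frac{x + x^{-1}}{2} - 1 = \frac{(x-1)^2}{2x},$$

so $J(x) \ge 0$, with equality exactly at $x = 1$.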
scope and limits
- Does not prove non-negativity of the cost.
- Does not prove symmetry or uniqueness of the zero.
- Does not handle the limit as the ratio approaches zero.
- Does not connect to Shannon entropy or Landauer bounds.
formal statement (Lean)
noncomputable def infoCost (e : RecognitionEvent) : ℝ := Jcost e.ratio
proof body
Definition body.
The definition is immediately followed in the source by the docstring of the next result:

/-- **THEOREM IC-001.1**: Every recognition event has non-negative information cost.
    J(x) ≥ 0 for all x > 0. This follows from AM-GM: (x + 1/x)/2 ≥ 1. -/
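A minimal Lean sketch of the AM-GM step behind IC-001.1, assuming the closed form of Jcost used above; the lemma name and tactic choices are illustrative, not the repository's proof.

import Mathlib

noncomputable def Jcost (x : ℝ) : ℝ := (x + x⁻¹) / 2 - 1

-- AM-GM in the form x + x⁻¹ ≥ 2 for x > 0, hence Jcost x ≥ 0.
theorem Jcost_nonneg {x : ℝ} (hx : 0 < x) : 0 ≤ Jcost x := by
  have hx' : x ≠ 0 := ne_of_gt hx
  have key : 2 ≤ x + x⁻¹ := by
    nlinarith [sq_nonneg (x - 1), mul_inv_cancel₀ hx', inv_pos.mpr hx]
  unfold Jcost
  linarith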