totalInfoCost
The total information cost of a ledger state sums the J-costs of its recognition events by left-folding addition from zero. Researchers formalizing information as ledger content in Recognition Science cite this when proving non-negativity or emptiness results for physical states. The definition is a direct accumulation that inherits non-negativity from the per-event infoCost function.
claim
For a ledger state $s$ with a finite list of recognition events, the total information cost equals the sum of infoCost$(e)$ over each event $e$ in the list, where infoCost is the J-cost of the event's recognition ratio.
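The claim can be written as a short Lean sketch relating the foldl accumulation to a sum over the mapped costs. This assumes Mathlib's `List.sum_eq_foldl` and `List.foldl_map` rewriting lemmas; the lemma name `totalInfoCost_eq_sum` is illustrative, not taken from the module.

```lean
-- Sketch: the foldl accumulation agrees with summing the mapped J-costs.
-- `totalInfoCost_eq_sum` is an illustrative name; the simp lemmas are
-- assumed from Mathlib and may need adjustment in the actual development.
theorem totalInfoCost_eq_sum (s : LedgerState) :
    totalInfoCost s = (s.events.map infoCost).sum := by
  simp [totalInfoCost, List.sum_eq_foldl, List.foldl_map]
```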
background
LedgerState is a structure holding a finite list of RecognitionEvent objects, each a recognition ratio carrying a J-cost. The module InformationIsLedger treats information as identical to the physical ledger, with J-cost zero only at balanced ratio 1 and positive otherwise. Upstream, PhiForcingDerived.of supplies the structure of J-cost while PrimitiveDistinction.from derives the four structural conditions plus three definitional facts from the seven axioms.
proof idea
One-line definition that applies foldl to accumulate infoCost over the events list starting from 0.
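A minimal worked unfolding on a two-event state, assuming LedgerState is a single-field structure so the anonymous-constructor syntax `⟨[e₁, e₂]⟩` applies; the accumulation is definitional, so `rfl` closes it:

```lean
-- foldl unfolds left-to-right: ((0 + infoCost e₁) + infoCost e₂).
example (e₁ e₂ : RecognitionEvent) :
    totalInfoCost ⟨[e₁, e₂]⟩ = 0 + infoCost e₁ + infoCost e₂ := rfl
```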
why it matters in Recognition Science
This definition anchors the IC-001 theorems, directly feeding total_info_cost_nonneg (non-negativity of total cost) and empty_state_zero_cost (zero cost for the empty ledger). It realizes the module claim that information equals the ledger by quantifying aggregate J-cost, linking to T5 J-uniqueness and the Recognition Composition Law.
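The non-negativity result it feeds can be sketched by generalizing the accumulator before inducting on the event list. This is a sketch, assuming a per-event lemma `infoCost_nonneg : ∀ e, 0 ≤ infoCost e` is available in the module (that name is an assumption):

```lean
-- Sketch of total_info_cost_nonneg: strengthen to an arbitrary
-- nonnegative accumulator, then induct on the event list.
-- `infoCost_nonneg` is assumed to exist; its name is illustrative.
theorem total_info_cost_nonneg (s : LedgerState) : 0 ≤ totalInfoCost s := by
  have h : ∀ (l : List RecognitionEvent) (acc : ℝ), 0 ≤ acc →
      0 ≤ l.foldl (fun a e => a + infoCost e) acc := by
    intro l
    induction l with
    | nil => intro acc hacc; exact hacc
    | cons e tl ih =>
        intro acc hacc
        exact ih (acc + infoCost e) (add_nonneg hacc (infoCost_nonneg e))
  exact h s.events 0 le_rfl
```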
scope and limits
- Does not apply to infinite event collections.
- Does not supply the explicit formula for infoCost on single events.
- Does not embed Shannon entropy computation.
formal statement (Lean)
noncomputable def totalInfoCost (s : LedgerState) : ℝ :=
  s.events.foldl (fun acc e => acc + infoCost e) 0
proof body
Definition body.