IndisputableMonolith.Information.JCostNecessity
This module axiomatizes the Recognition Information Cost as a function F on the positive reals that is symmetric under inversion, vanishes at unity, and is strictly convex. Researchers deriving information measures or minimum description length priors from Recognition Science principles cite it to ground the cost axioms. The module imports convexity results from Cost.Convexity to support equilibrium uniqueness, carrying no internal proofs of its own.
claim
A function $F: (0,∞) → ℝ$ satisfies the Recognition Information Cost axioms when $F(x) = F(x^{-1})$, $F(1) = 0$, and $F$ is strictly convex.
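A minimal Lean sketch of what such an axiom bundle could look like is given below. The structure name and field names are illustrative assumptions, not the module's actual declarations; only the three axioms themselves come from the source.

```lean
import Mathlib

/-- Hypothetical sketch of the Recognition Information Cost axioms;
    the structure and field names are illustrative, not the module's
    actual identifiers. -/
structure RecognitionCostAxioms (F : ℝ → ℝ) : Prop where
  /-- Symmetry under inversion: bi-directional recognition costs alike. -/
  symm : ∀ x : ℝ, 0 < x → F x = F x⁻¹
  /-- The cost vanishes at the balanced state. -/
  unit_zero : F 1 = 0
  /-- Strict convexity on the positive reals forces a unique stable
      equilibrium. -/
  strict_convex : StrictConvexOn ℝ (Set.Ioi 0) F
```

Bundling the three axioms as a `Prop`-valued structure keeps them citable as a single hypothesis in downstream uniqueness arguments.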
background
The module resides in the Information domain and imports IndisputableMonolith.Cost together with IndisputableMonolith.Cost.Convexity. The upstream Convexity module proves that Jlog(t) = cosh(t) − 1 is strictly convex on ℝ and that Jcost(x) = ½(x + x⁻¹) − 1 is strictly convex on ℝ₊; these facts are foundational for the uniqueness theorem T5. Locally, the module introduces the cost function F via the three axioms given in its documentation: symmetry for bi-directional recognition, a minimum of zero at the balanced state, and strict convexity for a unique stable equilibrium.
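Although this module states no proofs (the verification that Jcost satisfies the axioms lives in sibling declarations, per the scope notes), the first two axioms follow immediately from the definition of Jcost; as a quick consistency check:

```latex
J_{\mathrm{cost}}(x^{-1}) \;=\; \tfrac{1}{2}\!\left(x^{-1} + x\right) - 1 \;=\; J_{\mathrm{cost}}(x),
\qquad
J_{\mathrm{cost}}(1) \;=\; \tfrac{1}{2}\,(1 + 1) - 1 \;=\; 0 .
```

The remaining axiom, strict convexity of Jcost on ℝ₊, is exactly what the imported Convexity module supplies.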
proof idea
This is a definition module; it contains no proofs.
why it matters in Recognition Science
The module supplies the axiomatic base for the parent Information aggregator, which in turn supports CompressionPrior by grounding minimum description length in J-cost. It fills the necessity step for the cost function in the Recognition Science framework and connects directly to the J-uniqueness result T5 in the forcing chain.
scope and limits
- Does not derive an explicit closed-form expression for F.
- Does not prove that Jcost satisfies the axioms (handled in sibling declarations).
- Does not address thermodynamic or physical-unit interpretations.
- Does not connect the axioms to the phi-ladder or mass formulas.