pith. machine review for the scientific record.

IndisputableMonolith.Information.JCostNecessity


This module axiomatizes the Recognition Information Cost as a function F on positive reals that is symmetric under inversion, vanishes at unity, and is strictly convex. Researchers deriving information measures or minimum description length priors from Recognition Science principles cite it to ground the cost axioms. The module imports convexity results from Cost.Convexity to support equilibrium uniqueness; it contains no internal proofs.

claim

A function $F: (0,∞) → ℝ$ satisfies the Recognition Information Cost axioms when $F(x) = F(x^{-1})$, $F(1) = 0$, and $F$ is strictly convex.
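The three axioms can be sketched as a Lean structure. This is an illustrative reconstruction, not the module's actual declarations; the structure and field names below are hypothetical, and the convexity field uses Mathlib's `StrictConvexOn`.

```lean
import Mathlib.Analysis.Convex.Strict

-- Hypothetical sketch of the Recognition Information Cost axioms;
-- names are illustrative, not the module's actual declarations.
structure RecognitionCostAxioms (F : ℝ → ℝ) : Prop where
  /-- Symmetry under inversion: bi-directional recognition costs the same. -/
  symm : ∀ x : ℝ, 0 < x → F x = F x⁻¹
  /-- The balanced state x = 1 has zero cost. -/
  unit : F 1 = 0
  /-- Strict convexity on ℝ₊ forces a unique stable equilibrium. -/
  strictConvex : StrictConvexOn ℝ (Set.Ioi (0 : ℝ)) F
```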

background

The module resides in the Information domain and imports IndisputableMonolith.Cost together with IndisputableMonolith.Cost.Convexity. The upstream Convexity module proves that Jlog(t) = cosh(t) − 1 is strictly convex on ℝ and that Jcost(x) = ½(x + x⁻¹) − 1 is strictly convex on ℝ₊; these facts are stated to be foundational for uniqueness theorem T5. The local theoretical setting introduces the cost function F via the three axioms given in the module documentation: symmetry under inversion for bi-directional recognition, a minimum of zero at the balanced state, and strict convexity for a unique stable equilibrium.
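As a quick sanity check against the upstream definitions, the canonical cost Jcost(x) = ½(x + x⁻¹) − 1 satisfies the first two axioms by direct computation:

$$J(x^{-1}) = \tfrac{1}{2}\left(x^{-1} + x\right) - 1 = J(x), \qquad J(1) = \tfrac{1}{2}(1 + 1) - 1 = 0,$$

while strict convexity on ℝ₊ is exactly the fact proved in Cost.Convexity. So Jcost is a witness that the axiom set is non-vacuous.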

proof idea

This is a definition module: it states the cost axioms and contains no proofs.

why it matters in Recognition Science

The module supplies the axiomatic base for the parent Information aggregator, which in turn supports CompressionPrior by grounding minimum description length in J-cost. It fills the necessity step for the cost function in the Recognition Science framework and connects directly to the J-uniqueness result T5 in the forcing chain.

scope and limits

used by (1)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (2)

Lean names referenced from this declaration's body.

declarations in this module (3)