pith. machine review for the scientific record

probJCost


The definition assigns to each probability $p ∈ (0, 1]$ the J-cost $-\ln p$, the surprisal of the event relative to the uniform distribution. Information theorists working in the Recognition Science framework cite this when linking entropy to recognition effort. It is realized as a direct one-line assignment using the natural logarithm.

claim

For a real number $p$ with $0 < p ≤ 1$, the J-cost of the probability is $-\ln p$.
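
Two worked instances, computed directly from the definition (not taken from the formalization): a fair coin flip has $p = 1/2$, so its J-cost is $-\ln(1/2) = \ln 2 \approx 0.693$ nats, while a certain event has $p = 1$ and J-cost $-\ln 1 = 0$. Rarer events carry higher J-cost; certainty costs nothing.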

background

The module Information.ShannonEntropy derives Shannon entropy from Recognition Science's J-cost. J-cost quantifies recognition effort for positive ratios, as defined in upstream results including MultiplicativeRecognizerL4.cost (the derived cost of a multiplicative recognizer's comparator) and ObserverForcing.cost (the J-cost of a recognition event). For probabilities this specializes to surprisal, capturing deviation from uniformity per the module's core insight that information measures distance from the uniform distribution. The definition draws on cost structures from PhiForcingDerived.of and DAlembert.LedgerFactorization.of, which calibrate J for multiplicative settings.

proof idea

This is a one-line definition that sets the value directly to -Real.log p under the supplied positivity and upper-bound hypotheses on p.

why it matters in Recognition Science

This definition is invoked in totalJCost and shannon_equals_jcost to establish that Shannon entropy equals total J-cost summed over a distribution. It realizes the module target of deriving information theory from RS J-cost, connecting to the Recognition Composition Law and phi-forcing landmarks. It supports the claim that entropy emerges from recognition costs without extra structure.
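
The summation step can be sketched in Lean. Everything below other than `probJCost` itself is hypothetical: the project's actual `totalJCost` signature and hypotheses may differ, and this is only one plausible shape under those assumptions.

```lean
import Mathlib.Analysis.SpecialFunctions.Log.Basic

open BigOperators

noncomputable def probJCost (p : ℝ) (hp : p > 0) (hp1 : p ≤ 1) : ℝ :=
  -Real.log p

-- Hypothetical shape of totalJCost: the expected J-cost of a finite
-- distribution, i.e. each outcome's surprisal weighted by its probability.
noncomputable def totalJCost {n : ℕ} (p : Fin n → ℝ)
    (hp : ∀ i, 0 < p i) (hp1 : ∀ i, p i ≤ 1) : ℝ :=
  ∑ i, p i * probJCost (p i) (hp i) (hp1 i)
```

Under this shape, Shannon entropy $H(P) = -\sum_i p_i \ln p_i$ agrees with `totalJCost` term by term, since $p_i \cdot (-\ln p_i) = -(p_i \ln p_i)$, which is presumably what `shannon_equals_jcost` records.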

scope and limits

formal statement (Lean)

noncomputable def probJCost (p : ℝ) (hp : p > 0) (hp1 : p ≤ 1) : ℝ :=

proof body

Definition body.

  -Real.log p

/-- **THEOREM**: J-cost of uniform probability is log(n).
    -log(1/n) = log(n) -/
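
The uniform-probability theorem quoted above can be sketched as follows. The statement is phrased over a real $n ≥ 1$ for simplicity, and the hypothesis proofs and the theorem's actual name in the project may differ; the identity rests on Mathlib's `Real.log_inv`.

```lean
import Mathlib.Analysis.SpecialFunctions.Log.Basic

noncomputable def probJCost (p : ℝ) (hp : p > 0) (hp1 : p ≤ 1) : ℝ :=
  -Real.log p

-- Sketch: the uniform probability 1/n has J-cost log n,
-- because -log(1/n) = -(-log n) = log n.
example (n : ℝ) (hn : 1 ≤ n) :
    probJCost (1 / n)
      (div_pos one_pos (lt_of_lt_of_le one_pos hn))
      ((div_le_one (lt_of_lt_of_le one_pos hn)).mpr hn) = Real.log n := by
  unfold probJCost
  rw [one_div, Real.log_inv, neg_neg]
```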

used by (3)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (15)

Lean names referenced from this declaration's body.