probJCost
The definition assigns to each probability p in (0, 1] the value -ln p as its J-cost, measuring surprisal relative to the uniform distribution. Information theorists working in the Recognition Science framework cite this when linking entropy to recognition effort. It is realized as a direct one-line assignment using the natural logarithm.
claim
For a real number $p$ with $0 < p \le 1$, the J-cost of the probability is $-\ln p$.
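The claim can be illustrated numerically. The sketch below is an illustrative Python analogue of the Lean definition, not the definition itself; the name `prob_j_cost` and the explicit domain guard are assumptions made for the example:

```python
import math

def prob_j_cost(p: float) -> float:
    """Illustrative analogue of probJCost: surprisal -ln p for p in (0, 1]."""
    # The Lean definition takes hypotheses 0 < p and p <= 1; here we guard explicitly.
    if not (0.0 < p <= 1.0):
        raise ValueError("prob_j_cost requires 0 < p <= 1")
    return -math.log(p)

print(prob_j_cost(1.0))  # certain event has zero cost: -ln 1 = 0
print(prob_j_cost(0.5))  # -ln(1/2) = ln 2 ≈ 0.693
```

A certain event (p = 1) costs nothing to recognize, and the cost grows without bound as p approaches 0.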
background
The module Information.ShannonEntropy derives Shannon entropy from Recognition Science's J-cost. J-cost quantifies recognition effort for positive ratios, as defined in upstream results including MultiplicativeRecognizerL4.cost (the derived cost of a multiplicative recognizer's comparator) and ObserverForcing.cost (the J-cost of a recognition event). For probabilities this specializes to surprisal, capturing deviation from uniformity per the module's core insight that information measures distance from the uniform distribution. The definition draws on cost structures from PhiForcingDerived.of and DAlembert.LedgerFactorization.of, which calibrate J for multiplicative settings.
proof idea
This is a one-line definition that sets the value directly to -Real.log p under the supplied positivity and upper-bound hypotheses on p.
why it matters in Recognition Science
This definition is invoked in totalJCost and shannon_equals_jcost to establish that Shannon entropy equals total J-cost summed over a distribution. It realizes the module target of deriving information theory from RS J-cost, connecting to the Recognition Composition Law and phi-forcing landmarks. It supports the claim that entropy emerges from recognition costs without extra structure.
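The role of probJCost in totalJCost can be sketched numerically: Shannon entropy (in nats) is the expectation of the per-outcome J-cost over a distribution. The Python names below are illustrative analogues of the Lean `probJCost` and `totalJCost`, assumed for this example only:

```python
import math

def prob_j_cost(p: float) -> float:
    """Surprisal -ln p of a single probability in (0, 1]."""
    return -math.log(p)

def total_j_cost(dist: list[float]) -> float:
    """Expected J-cost over a distribution: sum of p_i * (-ln p_i).

    This is Shannon entropy in nats; zero-probability outcomes contribute nothing."""
    return sum(p * prob_j_cost(p) for p in dist if p > 0)

uniform4 = [0.25] * 4
print(total_j_cost(uniform4))  # ln 4 ≈ 1.386 nats, the entropy of a uniform 4-outcome distribution
```

For the uniform distribution on n outcomes, every term is (1/n) · ln n, so the total J-cost is ln n, the maximum-entropy case.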
scope and limits
- Does not apply for p = 0 or p > 1.
- Does not treat continuous probability densities.
- Does not incorporate dimensional or octave factors from the forcing chain.
- Does not compute costs for joint or composite events.
formal statement (Lean)
noncomputable def probJCost (p : ℝ) (hp : p > 0) (hp1 : p ≤ 1) : ℝ :=
proof body
Definition body.
  -Real.log p

/-- **THEOREM**: J-cost of uniform probability is log(n).
    -log(1/n) = log(n) -/
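The uniform case stated in the theorem docstring, -log(1/n) = log(n), can be checked numerically (a minimal sketch; the sample sizes are arbitrary):

```python
import math

# For a uniform distribution over n outcomes, each outcome has probability 1/n,
# so its J-cost is -ln(1/n) = ln(n).
for n in (2, 8, 1000):
    cost = -math.log(1 / n)
    print(n, cost, math.log(n))
```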