shannonEntropy
Shannon entropy for a discrete probability distribution over n outcomes is the negative sum of each probability times its natural logarithm. Information theorists working in Recognition Science cite this when equating entropy to expected J-cost over ledger states. The definition implements the formula directly with a guard clause that sets the contribution of any zero probability to zero.
claim
Let $d$ be a probability distribution over $n$ outcomes with probabilities $p_i$. The Shannon entropy is $H(d) = -\sum_i p_i \ln p_i$, where the summand is replaced by zero whenever $p_i = 0$.
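For illustration, the fair two-outcome distribution $p_1 = p_2 = \tfrac12$ has entropy $H = -(\tfrac12 \ln \tfrac12 + \tfrac12 \ln \tfrac12) = \ln 2 \approx 0.693$ nats, while a point mass ($p_1 = 1$, $p_2 = 0$) has $H = 0$, since the zero entry contributes nothing under the guard clause.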
background
The module derives Shannon entropy from the J-cost structure of Recognition Science. A ProbDist n is a structure with a map probs from Fin n to reals, together with proofs that every entry is non-negative and that the entries sum to one. The module documentation states that entropy quantifies uncertainty and emerges as the expected J-cost, where $J(x) = \tfrac12(x + 1/x) - 1$ measures recognition effort on probability ratios.
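A minimal Lean sketch of the pieces described above, under stated assumptions: only the field probs and the J-cost formula appear in this section, so the field names nonneg and sum_one and the constant name Jcost are placeholders for illustration, not the module's actual code.

import Mathlib

-- Sketch of the ProbDist structure described above; field names other than
-- `probs` are assumptions for illustration.
structure ProbDist (n : ℕ) where
  probs   : Fin n → ℝ
  nonneg  : ∀ i, 0 ≤ probs i
  sum_one : Finset.univ.sum probs = 1

-- The recognition cost J(x) = ½(x + 1/x) - 1 quoted in the module documentation.
noncomputable def Jcost (x : ℝ) : ℝ := (x + 1/x) / 2 - 1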
proof idea
This is a direct definition. The body negates a sum over Finset.univ, multiplying each positive probability by its natural log and replacing the contribution of any zero probability by zero.
why it matters in Recognition Science
The definition anchors the theorems in InformationIsLedger that prove entropy non-negativity, equality to total J-cost, and maximality for the uniform distribution. It realizes the module's target of obtaining information measures from RS J-cost minimization and connects to the Recognition Composition Law. It also bears on how entropy scales along the phi-ladder in the mass formula.
scope and limits
- Does not prove non-negativity of the entropy value.
- Does not treat continuous distributions.
- Does not include the factor $1/\ln 2$ for conversion to bits (see the sketch after this list).
- Does not compute total J-cost; that equality is a separate theorem.
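As a hedged illustration of the bits-conversion limitation, a bits-valued variant could be written as a thin wrapper; the name shannonEntropyBits is hypothetical and not part of the module.

-- Hypothetical wrapper (not in the module): convert from nats to bits by dividing by ln 2.
noncomputable def shannonEntropyBits {n : ℕ} (d : ProbDist n) : ℝ :=
  shannonEntropy d / Real.log 2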
formal statement (Lean)
noncomputable def shannonEntropy {n : ℕ} (d : ProbDist n) : ℝ :=
proof body
Definition body.
  -(Finset.univ.sum fun i =>
    if d.probs i > 0 then d.probs i * Real.log (d.probs i)
    else 0)
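The non-negativity theorem that this definition anchors is proved separately in InformationIsLedger; a hedged sketch of how such a statement would read follows, with the lemma name assumed and the proof omitted.

-- Sketch of the separate non-negativity theorem; the name and the omitted
-- proof are placeholders, not the module's actual code.
theorem shannonEntropy_nonneg {n : ℕ} (d : ProbDist n) :
    0 ≤ shannonEntropy d := by
  sorry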