
shannonEntropy


Shannon entropy for a discrete probability distribution over n outcomes is the negative sum of each probability times its natural logarithm. Information theorists working in Recognition Science cite this when equating entropy to expected J-cost over ledger states. The definition implements the formula directly with a guard clause that sets the contribution of any zero probability to zero.

claim

Let $d$ be a probability distribution over $n$ outcomes with probabilities $p_i$. The Shannon entropy is $H(d) = -\sum_i p_i \ln p_i$, where the summand is replaced by zero whenever $p_i = 0$.
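As a quick sanity check (an illustration added here, not taken from the module), the fair two-outcome distribution gives the familiar value in nats:

$H(\tfrac{1}{2},\tfrac{1}{2}) = -\left(\tfrac{1}{2}\ln\tfrac{1}{2} + \tfrac{1}{2}\ln\tfrac{1}{2}\right) = \ln 2 \approx 0.693.$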

background

The module derives Shannon entropy from the J-cost structure of Recognition Science. A ProbDist n is a structure with a map probs from Fin n to the reals, together with proofs that every entry is non-negative and that the entries sum to one. The module documentation states that entropy quantifies uncertainty and emerges as the expected J-cost, where $J(x) = \tfrac{1}{2}(x + 1/x) - 1$ measures recognition effort on probability ratios.
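A minimal Lean sketch of such a structure, for orientation only: the field name probs is taken from the prose above, but the proof-field names (nonneg, sum_one) and the Jcost helper are illustrative assumptions, not the module's actual declarations.

  import Mathlib

  /-- Illustrative sketch: a probability distribution over `Fin n`.
      Only `probs` is named in the prose; the other field names are guesses. -/
  structure ProbDist (n : ℕ) where
    probs   : Fin n → ℝ
    nonneg  : ∀ i, 0 ≤ probs i            -- every entry is non-negative
    sum_one : Finset.univ.sum probs = 1   -- the entries sum to one

  /-- Illustrative J-cost on a ratio, following the prose: J(x) = ½(x + 1/x) - 1. -/
  noncomputable def Jcost (x : ℝ) : ℝ := (x + 1 / x) / 2 - 1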

proof idea

This is a direct definition. The body negates a sum over Finset.univ, multiplying each positive probability by its natural log and replacing the contribution of any zero probability by zero.
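The guard matches the usual convention that $x \ln x \to 0$ as $x \to 0^{+}$. A worked case (added for illustration): for the degenerate distribution that puts all mass on one outcome,

$H(1,0,\dots,0) = -\bigl(1 \cdot \ln 1 + 0 + \dots + 0\bigr) = 0,$

so a certain outcome has zero entropy, and the if-branch is what makes the zero-probability terms contribute nothing rather than attempt $\ln 0$.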

why it matters in Recognition Science

The definition anchors the theorems in InformationIsLedger that prove entropy non-negativity, equality to total J-cost, and maximality for the uniform distribution. It realizes the module's goal of obtaining information measures from RS J-cost minimization and connects to the Recognition Composition Law. It also bears on the scaling of entropy with the phi-ladder in the mass formula.
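Hedged statement shapes for two of the results mentioned above, to show how this definition is consumed downstream; the names (primed to flag them), hypotheses, and exact forms used in InformationIsLedger are assumptions here, and the proofs are omitted.

  -- Illustrative statement shapes only; the module's actual names and forms may differ.
  theorem entropy_nonneg' {n : ℕ} (d : ProbDist n) :
      0 ≤ shannonEntropy d := by
    sorry

  -- Maximality for the uniform distribution over n outcomes, stated as an upper bound.
  theorem entropy_le_log_card' {n : ℕ} (hn : 0 < n) (d : ProbDist n) :
      shannonEntropy d ≤ Real.log n := by
    sorry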

scope and limits

formal statement (Lean)

  noncomputable def shannonEntropy {n : ℕ} (d : ProbDist n) : ℝ :=

proof body

Definition body.

    -(Finset.univ.sum fun i =>
      if d.probs i > 0 then d.probs i * Real.log (d.probs i)
      else 0)
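A minimal usage sketch, reusing the illustrative ProbDist field names from the background section; the constructor's side conditions are left as placeholders, so this is orientation only, not the module's own code.

  -- A fair two-outcome distribution (illustrative; field names are assumptions).
  noncomputable def fairCoin : ProbDist 2 where
    probs   := fun _ => (1 : ℝ) / 2
    nonneg  := fun _ => by norm_num      -- each entry is 1/2, hence ≥ 0
    sum_one := by sorry                  -- ∑ i : Fin 2, (1/2 : ℝ) = 1

  -- On paper this equals Real.log 2; the value is noncomputable, so no #eval.
  noncomputable example : ℝ := shannonEntropy fairCoin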

used by (9)

From the project-wide theorem graph. These declarations reference this one in their bodies.

depends on (6)

Lean names referenced from this declaration's body.