
totalJCost


This definition computes the aggregate J-cost of a discrete probability distribution by summing, over each outcome with positive probability, the probability multiplied by its J-cost term. Information theorists working in Recognition Science cite it when linking Shannon entropy to the underlying cost structure. The implementation is a direct Finset summation that applies probJCost only to positive probabilities and uses the normalization hypothesis to bound each probability by 1.

claim

Let $d$ be a probability distribution over $n$ outcomes with probabilities $p_i$. The total J-cost is defined as the sum over all $i$ with $p_i > 0$ of $p_i$ times the J-cost of $p_i$, where the J-cost function is the one induced by the multiplicative recognizer on positive ratios.
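Written out with the J-cost function described in the background section below, the claim reads (this is a transcription of the prose, not of the Lean source):

$$\mathrm{totalJCost}(d) \;=\; \sum_{i \,:\, p_i > 0} p_i \, J(p_i), \qquad J(x) = \tfrac{1}{2}\left(x + \tfrac{1}{x}\right) - 1.$$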

background

The Information.ShannonEntropy module derives Shannon entropy from Recognition Science's J-cost. A ProbDist n is the structure whose field probs : Fin n → ℝ satisfies nonnegativity for every index and normalization (the Finset sum equals 1). The J-cost itself is the function J(x) = ½(x + 1/x) - 1 applied to probability values via the helper probJCost. Upstream, the cost definition in ObserverForcing states that the cost of any recognition event is its J-cost, while the entropy definition in InitialCondition equates entropy to total defect.
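For orientation, here is a minimal Lean sketch of the shapes described above. The field names probs, nonneg, and normalized are taken from the proof body further down; the exact signature of probJCost (in particular its two proof arguments and the precise form of the upper bound) is inferred from its call site and should be read as an assumption, not as the module's actual code.

import Mathlib

-- Sketch only: shapes inferred from the prose above and the call site below.
structure ProbDist (n : ℕ) where
  probs      : Fin n → ℝ                  -- probability assigned to each outcome
  nonneg     : ∀ i, 0 ≤ probs i           -- nonnegativity for every index
  normalized : Finset.univ.sum probs = 1  -- normalization: the Finset sum equals 1

-- Assumed signature: the value, a positivity proof, and an upper-bound proof,
-- matching the two extra arguments passed at the call site in the proof body.
noncomputable def probJCost (p : ℝ) (hpos : 0 < p) (hle : p ≤ 1) : ℝ :=
  (p + 1 / p) / 2 - 1                     -- J(x) = ½(x + 1/x) − 1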

proof idea

The definition is a one-line wrapper that expands to a Finset.univ sum. For each index i the term is p_i * probJCost(p_i) when p_i > 0 and 0 otherwise. The positivity hypothesis is supplied directly; the auxiliary inequality needed by probJCost is discharged by single_le_sum on the normalized sum followed by linarith.
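Spelled out, the auxiliary inequality is

$$p_i \;\le\; \sum_{j} p_j \;=\; 1,$$

where the first step is Finset.single_le_sum applied to the nonnegativity field and the second is the normalization field; linarith then closes the $p_i \le 1$ obligation that probJCost appears to expect (the exact shape of that obligation is inferred from the call site).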

why it matters in Recognition Science

totalJCost supplies the right-hand side for the theorem shannon_entropy_equals_expected_jcost, which proves Shannon entropy equals expected J-cost and thereby confirms consistency with classical information theory. It fills the INFO-001 target stated in the module doc and is invoked in BalancedJSubsetSum results that bound cryptographic solution costs. The construction rests on the J-uniqueness property (T5) and the eight-tick octave structure of the forcing chain.
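Schematically, the equality that theorem is said to establish is

$$H(d) \;=\; \mathrm{totalJCost}(d) \;=\; \sum_{i \,:\, p_i > 0} p_i \, J(p_i),$$

with $H$ denoting Shannon entropy; this is a restatement of the page's description, not the theorem's literal Lean statement.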

scope and limits

formal statement (Lean)

noncomputable def totalJCost {n : ℕ} (d : ProbDist n) : ℝ :=

proof body

Definition body.

  Finset.univ.sum fun i =>
    if h : d.probs i > 0 then
      d.probs i * probJCost (d.probs i) h (by
        have := d.normalized
        have hs := Finset.single_le_sum (fun j _ => d.nonneg j) (Finset.mem_univ i)
        simp at hs
        linarith)
    else 0

/-- **THEOREM (Shannon = J-Cost)**: Shannon entropy equals total J-cost.
    This is the key connection! -/
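Two remarks on the body above, offered as reading notes rather than as claims about the module. First, the dependent if threads the positivity witness h into probJCost, so outcomes with zero probability contribute 0 and the 1/p term inside J is never evaluated at zero. Second, as a quick sanity check under the J(x) = ½(x + 1/x) − 1 reading from the background section, the uniform distribution $p_i = 1/n$ gives

$$\mathrm{totalJCost}(d_{\mathrm{unif}}) \;=\; \sum_{i=1}^{n} \frac{1}{n}\, J\!\left(\frac{1}{n}\right) \;=\; J\!\left(\frac{1}{n}\right) \;=\; \frac{1}{2}\left(\frac{1}{n} + n\right) - 1.$$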

used by (12)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (18)

Lean names referenced from this declaration's body.