pith. machine review for the scientific record.

englishEntropyRate


The declaration assigns English text a fixed entropy rate of 1.2 bits per symbol. Results on J-cost bounds for compression in Recognition Science cite this constant when comparing empirical rates to the Shannon limit. It is a direct constant assignment with no computation and no supporting lemmas.

claim

The entropy rate of English text is defined to be $1.2$ bits per symbol.

background

The Information.Compression module derives fundamental limits on data compression from J-cost under INFO-003. Shannon's Source Coding Theorem supplies the baseline: average code length is at least the entropy $H(X) = -\sum_x p(x) \log_2 p(x)$, and no scheme can beat this limit. In Recognition Science, information carries J-cost; compression lowers that cost by increasing organization, so the entropy limit equals the minimum J-cost for faithful representation.
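The source-coding bound can be checked numerically. Below is a minimal Python sketch (illustrative only, not part of the Lean module): it computes the entropy of a toy dyadic distribution and the average length of an optimal prefix (Huffman) code, which meets the entropy bound exactly in the dyadic case.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code lengths of an optimal prefix (Huffman) code for the distribution."""
    # Heap items: (probability, unique tiebreak id, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1  # each merge adds one bit to every contained symbol
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]  # dyadic toy distribution
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)  # average code length equals the entropy here: 1.75 bits/symbol
```

For non-dyadic distributions the average Huffman length strictly exceeds $H(X)$ but stays within one bit of it, consistent with the source-coding theorem.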

proof idea

Direct constant definition that assigns the value 1.2 to the identifier.

why it matters in Recognition Science

The definition supplies the numerical English entropy rate used when the module treats compression as J-cost minimization. It anchors the Shannon limit inside the RS setting where entropy equals minimum J-cost. The value supports later comparisons between empirical rates and the phi-ladder structure of the forcing chain.
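One way to obtain such an empirical rate for comparison against the 1.2 bits-per-symbol constant is to divide the compressed size of a text sample by its symbol count. A hedged Python sketch using the standard-library zlib compressor (the sample text and helper name are illustrative, not from the module):

```python
import zlib

def empirical_rate(text: bytes, level: int = 9) -> float:
    """Compressed size in bits divided by symbol count: an upper bound on
    the achievable rate for this sample, inflated by format overhead."""
    return 8 * len(zlib.compress(text, level)) / len(text)

# Illustrative sample; repetition makes the redundancy easy to exploit.
sample = (b"it is a truth universally acknowledged that a single man "
          b"in possession of a good fortune must be in want of a wife. ") * 20
rate = empirical_rate(sample)
print(f"{rate:.3f} bits/symbol")  # far below the 8 bits/symbol of raw ASCII
```

A general-purpose compressor gives only an upper bound on the true entropy rate; the gap between such empirical rates and the assumed 1.2 bits per symbol is what the module's comparisons would quantify.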

scope and limits

The declaration is a bare numeric assignment: the value 1.2 is posited rather than derived, and no lemma in the module constrains it or checks it against empirical data.

formal statement (Lean)

noncomputable def englishEntropyRate : ℝ := 1.2  -- bits
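A hypothetical downstream use might look like the following Lean sketch; the `admissibleRate` name and statement are illustrative only and do not appear in the module:

```lean
import Mathlib.Data.Real.Basic

-- Illustrative sketch only; `admissibleRate` is not part of the module.
noncomputable def englishEntropyRate : ℝ := 1.2  -- bits

-- An empirical coding rate is admissible if it does not undercut the
-- assumed English entropy rate (the Shannon bound in this setting).
def admissibleRate (empirical : ℝ) : Prop :=
  englishEntropyRate ≤ empirical
```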

proof body

Definition body.
