
biasedCoinEntropy

biasedCoinEntropy defines the Shannon entropy of a coin flip with heads probability 0.9 as the explicit sum $-0.9\log_2(0.9) - 0.1\log_2(0.1)$. Information theorists working inside the Recognition Science model cite it to exhibit an entropy value below one bit that still obeys the J-cost minimum. The definition is a direct arithmetic expression that invokes only the module's log2 helper.

claim

For a biased coin with $P(H)=0.9$ and $P(T)=0.1$, the entropy is $H = -0.9\log_2(0.9) - 0.1\log_2(0.1)$ (approximately 0.47 bits).
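
The approximation follows by direct evaluation; with $\log_2 0.9 \approx -0.1520$ and $\log_2 0.1 \approx -3.3219$ (rounded to four decimal places):

$$H = 0.9 \times 0.1520 + 0.1 \times 3.3219 \approx 0.1368 + 0.3322 \approx 0.4690 \text{ bits}.$$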

background

The Information.Compression module derives data-compression limits from J-cost. Its module doc states that entropy equals the minimum J-cost for faithful representation and that compression algorithms seek this minimum. Upstream, entropy is defined in InitialCondition as the total_defect of a configuration, while the cost functions in ObserverForcing and MultiplicativeRecognizerL4 each return the J-cost of a recognition event or comparator.

proof idea

The definition is a direct one-line arithmetic expression that applies the log2 helper from the same module to the two probability terms.
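
A minimal self-contained sketch of the declaration, assuming the log2 helper is the usual natural-log ratio (the helper's actual definition is not shown in this entry, so its shape here is an assumption):

  import Mathlib.Analysis.SpecialFunctions.Log.Basic

  -- Assumed shape of the module's log2 helper (hypothetical; not shown in this entry).
  noncomputable def log2 (x : ℝ) : ℝ := Real.log x / Real.log 2

  -- The definition itself: one arithmetic expression, no lemmas invoked.
  noncomputable def biasedCoinEntropy : ℝ :=
    -0.9 * log2 0.9 - 0.1 * log2 0.1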

why it matters in Recognition Science

The definition supplies a concrete numerical illustration for the module's claim that compression is J-cost minimization and that the entropy bound is the minimum achievable J-cost. It sits inside the INFO-003 development, which links Shannon's source coding theorem to the Recognition Science claim that organized information carries lower J-cost.
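
For reference, the source coding theorem invoked here states that any uniquely decodable binary code for a source $X$ has expected codeword length at least the entropy,

$$\mathbb{E}[\ell(X)] \ge H(X),$$

so roughly 0.47 bits per flip is the compression floor for this coin.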

scope and limits

The definition fixes the bias at 0.9: it is a single noncomputable numeric instance over ℝ, not a general entropy function, and the approximate value of 0.47 bits appears only in prose; no accompanying theorem is shown in this entry.

formal statement (Lean)

  noncomputable def biasedCoinEntropy : ℝ :=

proof body

Definition body.

    -0.9 * log2 0.9 - 0.1 * log2 0.1

  /-! ## J-Cost Connection -/

  /-- In RS, compression is J-cost minimization:

      **Uncompressed data**: High redundancy = High J-cost
      **Compressed data**: No redundancy = Low J-cost
      **Perfect compression**: J-cost = entropy (minimum)

      Compression algorithms seek minimum J-cost! -/
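
As a quick numeric sanity check, the same expression can be mirrored over Float, which is executable (Float.log2 is in the Lean core library; this check is illustrative and not part of the formalization):

  -- Prints ≈ 0.4690, matching the claimed 0.47 bits.
  #eval -0.9 * Float.log2 0.9 - 0.1 * Float.log2 0.1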

depends on (11)

Lean names referenced from this declaration's body.