pith. machine review for the scientific record.
lemma · proved · term proof · high

log2_two


The lemma establishes that the base-2 logarithm of 2 equals 1. Researchers deriving Shannon entropy bounds from J-cost minimization in Recognition Science cite this identity when anchoring calculations for binary messages or fair-coin entropy. The proof is a direct term application of the general log_b b = 1 property, with the base condition discharged by norm_num.

claim

$\log_2 2 = 1$

background

In the Information.Compression module, the function log2 is introduced as the real base-2 logarithm, Real.logb 2 x. This definition supports the module's goal of deriving compression limits from J-cost: the entropy is H(X) = -∑ₓ p(x) log2 p(x), and the minimum J-cost for a faithful representation coincides with this entropy bound.
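A minimal Lean sketch of this setup. The definition of log2 follows the module's stated Real.logb 2 x; the entropy definition H is an illustrative assumption, not the module's exact code:

```lean
import Mathlib

open BigOperators

-- Base-2 logarithm, as described for Information.Compression.
noncomputable def log2 (x : ℝ) : ℝ := Real.logb 2 x

-- Illustrative Shannon entropy in bits over a finite alphabet;
-- the module's actual definition may differ in form.
noncomputable def H {α : Type*} [Fintype α] (p : α → ℝ) : ℝ :=
  -∑ x, p x * log2 (p x)
```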

proof idea

The proof is a one-line term wrapper that applies Real.logb_self_eq_one to the specific base 2. The side condition 1 < 2 is discharged inline by norm_num.
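The instantiation can be checked directly against the underlying Mathlib lemma. This is a sketch assuming the signature used in the formal statement below, Real.logb_self_eq_one : 1 < b → Real.logb b b = 1:

```lean
import Mathlib

-- Unfolding log2 to Real.logb 2, the lemma is exactly this instance;
-- norm_num discharges the side condition (1 : ℝ) < 2.
example : Real.logb 2 2 = 1 :=
  Real.logb_self_eq_one (by norm_num)
```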

why it matters in Recognition Science

This identity anchors the entropy calculations that feed the source coding theorem and fair-coin entropy results in the same module. It thereby supports the Recognition Science claim that compression equals J-cost minimization and that the entropy limit is the minimum J-cost for lossless representation.
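As a concrete instance of this anchoring role, the fair-coin entropy computation reduces to the identity, using $\log_2(1/2) = -\log_2 2 = -1$:

```latex
H\!\left(\tfrac12,\tfrac12\right)
  = -\left(\tfrac12\log_2\tfrac12 + \tfrac12\log_2\tfrac12\right)
  = -\left(\tfrac12\cdot(-1) + \tfrac12\cdot(-1)\right)
  = 1 \text{ bit}
```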

scope and limits

The lemma is a direct specialization of the general identity log_b b = 1 to the base 2; it carries no mathematical content beyond that instantiation.

formal statement (Lean)

  lemma log2_two : log2 2 = 1 := Real.logb_self_eq_one (by norm_num : (1 : ℝ) < 2)

proof body

Term-mode proof: the proof term Real.logb_self_eq_one (by norm_num : (1 : ℝ) < 2) is supplied inline in the statement, so there is no separate tactic block.


depends on (1)

Lean names referenced from this declaration's body: Real.logb_self_eq_one.