pith · machine review for the scientific record

arithmeticCoding (def)

definition
module: IndisputableMonolith.Information.Compression
domain: Information
line: 191 · github
papers citing: none yet

formal source

/-- Arithmetic coding:
    - Near-optimal for any distribution
    - L → H as message length → ∞
    - Encodes message as a single number -/
def arithmeticCoding : String :=
  "Near-optimal, L → H for long messages"
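
The Lean definition above is only a descriptive stub. A minimal sketch of the idea in Python (the three-symbol model `PROBS` is an invented example, not from the source): each symbol narrows the current subinterval of [0, 1) in proportion to its probability, so the whole message is encoded as a single number inside the final interval.

```python
# Illustrative arithmetic coder over exact rationals. The final interval has
# width equal to the product of the symbol probabilities, so identifying a
# number in it costs about sum(-log2 p) bits -- i.e. L approaches H.
from fractions import Fraction

# hypothetical symbol model (an assumption for this sketch)
PROBS = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}

def cumulative(probs):
    """Map each symbol to its cumulative-probability interval [lo, hi)."""
    lo, out = Fraction(0), {}
    for sym, p in probs.items():
        out[sym] = (lo, lo + p)
        lo += p
    return out

def encode(message, probs):
    """Narrow [0, 1) symbol by symbol; return the final interval (lo, hi)."""
    ranges = cumulative(probs)
    lo, hi = Fraction(0), Fraction(1)
    for sym in message:
        s_lo, s_hi = ranges[sym]
        width = hi - lo
        lo, hi = lo + width * s_lo, lo + width * s_hi
    return lo, hi  # any number in [lo, hi) identifies the message

def decode(x, probs, n):
    """Recover n symbols from a single number x inside the final interval."""
    ranges = cumulative(probs)
    out = []
    for _ in range(n):
        for sym, (s_lo, s_hi) in ranges.items():
            if s_lo <= x < s_hi:
                out.append(sym)
                x = (x - s_lo) / (s_hi - s_lo)  # rescale and continue
                break
    return "".join(out)

lo, hi = encode("abca", PROBS)
print(decode(lo, PROBS, 4))  # → abca
```

For "abca" the interval width is 1/2 · 1/4 · 1/4 · 1/2 = 1/64, i.e. about 6 bits, exactly the message's ideal codelength Σ −log₂ p.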

/-- Lempel-Ziv compression (LZ77, LZ78, LZW):
    - Dictionary-based
    - Adaptive (learns patterns)
    - Achieves entropy in limit
    - Used in ZIP, PNG, GIF -/
def lempelZiv : String :=
  "Dictionary-based, adaptive, asymptotically optimal"
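
As a hedged illustration of the dictionary-based family named in the docstring, here is a compact LZW codec in Python (LZW is chosen as the simplest of the three variants; this is a sketch, not the Lean module's method). The dictionary starts with single bytes and grows adaptively as repeated patterns appear.

```python
# Illustrative LZW: the dictionary "learns" patterns as it goes, so repetitive
# input compresses to far fewer codes than symbols.
def lzw_compress(text):
    """Return a list of dictionary codes for `text`."""
    dictionary = {chr(i): i for i in range(256)}
    w, codes = "", []
    for ch in text:
        wc = w + ch
        if wc in dictionary:
            w = wc                              # keep extending the match
        else:
            codes.append(dictionary[w])
            dictionary[wc] = len(dictionary)    # learn the new pattern
            w = ch
    if w:
        codes.append(dictionary[w])
    return codes

def lzw_decompress(codes):
    """Invert lzw_compress by rebuilding the same dictionary on the fly."""
    dictionary = {i: chr(i) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        entry = dictionary.get(code, w + w[0])  # handles the KwKwK edge case
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[0]
        w = entry
    return "".join(out)

msg = "abababababab"
codes = lzw_compress(msg)
assert lzw_decompress(codes) == msg
print(len(codes), "codes for", len(msg), "symbols")  # → 6 codes for 12 symbols
```

The adaptivity is the point: no probability model is supplied up front, yet on long inputs the code rate approaches the source's entropy rate.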

/-! ## The Entropy Rate -/

/-- For stationary sources, the entropy rate:

    h = lim_{n→∞} (1/n) H(X₁, X₂, ..., Xₙ)

    This accounts for correlations between symbols.

    Example: English text
    - Single-letter entropy: ~4.2 bits/letter
    - Entropy rate: ~1.0-1.5 bits/letter (due to correlations) -/
noncomputable def englishLetterEntropy : ℝ := 4.2  -- bits/letter
noncomputable def englishEntropyRate : ℝ := 1.2  -- bits/letter
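
The gap between single-letter entropy and entropy rate can be checked numerically on a toy correlated source (the repeating sample string is an invented example, not English data): the first-order conditional entropy H(X₂ | X₁) falls well below the single-symbol entropy H(X₁), mirroring the quoted drop from ~4.2 to ~1.2 bits/letter.

```python
# Toy demonstration that correlations lower the entropy rate: for a fully
# periodic source, H(X1) is positive but H(X2 | X1) is zero.
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy in bits of an empirical count distribution."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def conditional_entropy(text):
    """Estimate H(X2 | X1) = H(X1, X2) - H(X1) from adjacent pairs."""
    pairs = Counter(zip(text, text[1:]))
    singles = Counter(text[:-1])
    return entropy(pairs) - entropy(singles)

sample = "abcabcabcabcabcabcabcabc"  # each symbol determined by the previous
h1 = entropy(Counter(sample))        # ≈ log2(3) ≈ 1.585 bits/symbol
h_rate = conditional_entropy(sample) # 0.0: correlations remove all surprise
print(h1, h_rate)
```

Real English sits between these extremes: letters are correlated but not fully determined, so the rate drops to roughly a quarter of the single-letter figure rather than to zero.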

theorem english_is_redundant :
    -- English has ~70% redundancy
    True := trivial
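
The ~70% figure follows from the two constants defined above; a one-line sanity check of the arithmetic:

```python
# Redundancy = 1 - (entropy rate / single-letter entropy), using the values
# quoted in the docstring above (4.2 and 1.2 bits/letter).
single_letter_entropy = 4.2   # bits/letter
entropy_rate = 1.2            # bits/letter
redundancy = 1 - entropy_rate / single_letter_entropy
print(f"{redundancy:.0%}")  # → 71%
```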

/-! ## Summary -/