pith — machine review for the scientific record

def messageJCost

module: IndisputableMonolith.Information.Compression
domain: Information
source: line 117 · GitHub
papers citing: none yet


formal source

    This explains why you can't compress random data:
    Random data already has minimum J-cost for its entropy! -/
noncomputable def messageJCost (length redundancy : ℝ) : ℝ :=
  length * (1 - redundancy)

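As a quick sanity check on the definition, the zero-redundancy case makes the docstring's claim concrete: random data has no compression headroom, so its J-cost is its full length. A minimal sketch (the declaration is redefined locally so the snippet compiles standalone; the numeric example is illustrative, not from the source):

```lean
import Mathlib

-- Local copy of the declaration above, so this sketch is self-contained.
noncomputable def messageJCost (length redundancy : ℝ) : ℝ :=
  length * (1 - redundancy)

-- Zero redundancy (random data): J-cost equals the full length.
example (L : ℝ) : messageJCost L 0 = L := by
  simp [messageJCost]

-- A 100-symbol message with 25% redundancy: J-cost 75.
example : messageJCost 100 (1/4) = 75 := by
  norm_num [messageJCost]
```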
/-! ## Lossless vs Lossy Compression -/

/-- Lossless compression:
    - Exact reconstruction possible
    - Limit: H(X) bits
    - Examples: ZIP, PNG, FLAC

    In RS: Preserves all ledger information -/
def losslessCompression : String :=
  "Exact reconstruction, limit = entropy"

/-- Lossy compression:
    - Approximate reconstruction
    - Can go below H(X) with distortion
    - Examples: JPEG, MP3, video codecs

    In RS: Discards high J-cost (less important) information -/
def lossyCompression : String :=
  "Approximate reconstruction, accepts distortion"

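The two regimes connect back to `messageJCost`: treating more of a message as redundant (as a lossy codec does when it discards detail) strictly lowers the remaining J-cost. A minimal sketch under that reading (again redefining `messageJCost` locally; the specific redundancy values are illustrative):

```lean
import Mathlib

-- Local copy of messageJCost so the example is self-contained.
noncomputable def messageJCost (length redundancy : ℝ) : ℝ :=
  length * (1 - redundancy)

-- For a fixed 100-symbol message, higher redundancy means lower J-cost:
-- 100 * (1 - 1/2) = 50 < 90 = 100 * (1 - 1/10).
example : messageJCost 100 (1/2) < messageJCost 100 (1/10) := by
  norm_num [messageJCost]
```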
/-- Rate-distortion theory:

    R(D) = minimum rate for distortion ≤ D

    Trade-off between compression and quality.

    In RS: Which ledger information to discard
    based on J-cost importance. -/