lossyCompression
The lossyCompression definition supplies a string label for approximate reconstruction that accepts distortion, allowing rates below source entropy. Information theorists cite it when extending Shannon limits to rate-distortion trade-offs inside the Recognition Science $J$-cost framework. The realization is a direct string assignment with no computation or lemmas.
claim
Lossy compression is defined as approximate reconstruction that accepts distortion $D$, permitting a rate $R(D)$ below the entropy $H(X)$ by discarding high $J$-cost information.
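For context, the classical rate-distortion function (standard Shannon theory; this formula is not part of the module) minimizes mutual information over reconstruction channels meeting the distortion budget:

$R(D) = \min_{p(\hat{x} \mid x)\,:\, \mathbb{E}[d(X,\hat{X})] \le D} I(X; \hat{X})$

Here $d$ is a distortion measure; $R(D)$ is nonincreasing in $D$, and for Hamming distortion $R(0) = H(X)$, recovering the lossless limit.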
background
The Information.Compression module derives compression limits from $J$-cost. Shannon's source coding theorem requires an average code length of at least $H(X) = -\sum_x p(x) \log_2 p(x)$. In Recognition Science, every recognition event carries nonnegative $J$-cost, and compression lowers total $J$-cost by increasing organization.
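As a concrete check of that bound, a minimal Lean sketch (the entropy helper and its Float encoding are assumptions for illustration, not module code):

-- Sketch only: Shannon entropy of a finite distribution,
-- H(X) = -∑ p(x) log₂ p(x), with probabilities given as Floats.
def entropy (p : List Float) : Float :=
  -(p.foldl (fun acc px => acc + (if px > 0 then px * Float.log2 px else 0)) 0)

#eval entropy [0.5, 0.25, 0.25]  -- 1.5: no lossless code averages below 1.5 bits/symbol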
proof idea
Direct string definition assigning the literal value. No lemmas from upstream cost definitions or forcing structures are invoked; the body is a one-line string literal.
why it matters in Recognition Science
This definition completes the lossy case inside INFO-003 on data compression limits from $J$-cost. It supports the module claim that compression equals $J$-cost minimization and prepares extensions toward rate-distortion theory while remaining consistent with the Recognition Composition Law.
scope and limits
- Does not supply a formula for the rate-distortion function $R(D)$ (a standard closed form for one simple source is sketched after this list).
- Does not prove achievability or converse bounds.
- Does not reference the phi-ladder or eight-tick octave.
- Does not quantify distortion in terms of specific J-cost thresholds.
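For what such a formula looks like, a standard textbook closed form (a classical result, not derived from $J$-cost here): a Bernoulli($p$) source under Hamming distortion has

$R(D) = H_b(p) - H_b(D)$ for $0 \le D \le \min(p, 1-p)$, and $R(D) = 0$ otherwise,

where $H_b(q) = -q \log_2 q - (1-q) \log_2(1-q)$ is the binary entropy function.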
formal statement (Lean)
def lossyCompression : String :=
  "Approximate reconstruction, accepts distortion"

/-- Rate-distortion theory:

    R(D) = minimum rate for distortion ≤ D

    Trade-off between compression and quality.

    In RS: Which ledger information to discard
    based on J-cost importance. -/

proof body
Definition body.
147 based on J-cost importance. -/