pith. machine review for the scientific record.

summary

This definition supplies a five-point summary linking data compression limits to J-cost minimization in Recognition Science. Information theorists working within the RS framework would cite it when restating Shannon entropy as the minimum J-cost for faithful representation. The definition is a direct static list construction with no computation or lemma application.

claim

The summary is the list: compression limit equals entropy $H(X)$; compression equals J-cost minimization; entropy equals the minimum J-cost for faithful representation; redundancy equals removable excess J-cost; and random data is already at minimum J-cost.
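
For reference, the classical result being recast here is Shannon's source coding bound: for any uniquely decodable binary code $C$ for a source $X$ with distribution $p$, the expected codeword length satisfies $\mathbb{E}[\ell(C(X))] \ge H(X) = -\sum_x p(x) \log_2 p(x)$, with equality approachable to within one bit (e.g., by Huffman coding). The claim identifies this entropy floor with the minimum J-cost of a faithful representation.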

background

The Information.Compression module derives fundamental limits on lossless compression from J-cost, reinterpreting Shannon's source coding theorem. Upstream, the shifted cost is defined as $H(x) = J(x) + 1$ and satisfies the d'Alembert equation $H(xy) + H(x/y) = 2 H(x) H(y)$. Entropy is defined as the total_defect of a configuration, with zero defect yielding the minimum-entropy state. Cost functions from multiplicative recognizers and recognition events are identified with J-cost.
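
A quick check of the d'Alembert equation, assuming the standard Recognition Science cost $J(x) = \tfrac{1}{2}(x + x^{-1}) - 1$ (the upstream module's exact form is not restated on this page): then $H(x) = J(x) + 1 = \tfrac{1}{2}(x + x^{-1})$, and for $x, y > 0$,

$H(xy) + H(x/y) = \tfrac{1}{2}\bigl(xy + \tfrac{1}{xy}\bigr) + \tfrac{1}{2}\bigl(\tfrac{x}{y} + \tfrac{y}{x}\bigr) = \tfrac{1}{2}(x + x^{-1})(y + y^{-1}) = 2\,H(x)\,H(y).$

Substituting $x = e^a$, $y = e^b$ gives $H(e^a) = \cosh a$, and the equation becomes the hyperbolic identity $\cosh(a+b) + \cosh(a-b) = 2 \cosh a \cosh b$.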

proof idea

The definition directly constructs the List String containing the five summary statements; no lemmas are applied and no tactics are used.

why it matters in Recognition Science

This definition supplies the high-level summary for compression limits derived from J-cost, connecting Shannon entropy to the minimum J-cost for faithful representation. It aligns with the Recognition framework's treatment of information cost and the forcing chain steps that establish J-uniqueness. It touches the open question of whether J-cost reduction can be measured directly in practical compression schemes.

scope and limits

This declaration is documentation-level: it records the five summary statements but proves nothing on its own. The conditions under which the underlying derivation would fail are given in the module's Falsification Criteria, reproduced with the Lean source below.

formal statement (Lean)

def summary : List String := [

proof body

Definition body.

  "Compression limit = entropy H(X)",
  "Compression = J-cost minimization",
  "Entropy = minimum J-cost for faithful representation",
  "Redundancy = removable excess J-cost",
  "Random data already at minimum J-cost"
]

/-! ## Falsification Criteria -/

/-- The derivation would be falsified if:
    1. Compression below entropy achieved
    2. J-cost doesn't decrease with compression
    3. Random data can be systematically compressed -/
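
A minimal sanity check that could accompany this definition, assuming the module compiles as shown; the example below is illustrative and not part of the source:

-- Illustrative only: the summary list has exactly five entries.
-- This closes by definitional unfolding of List.length on the literal list.
example : summary.length = 5 := rfl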

depends on (8)

Lean names referenced from this declaration's body.