pith. machine review for the scientific record.

IndisputableMonolith.Statistics.BayesianFilteringFromVFE


This module constructs Bayesian filtering from variational free energy on finite recognition partitions. It supplies definitions for likelihood, evidence, single-step updates, and iterated filters, ending with a certification theorem. The structure follows directly from the monotone descent of VFE under ledger updates in the imported module.

claim

The module defines the Bayesian update step and certifies that iterated filtering on a finite partition satisfies the VFE-minimizing property: $F[q_{t+1};p] < F[q_t;p]$ under the recognition ledger update.
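The inequality packages a standard Bayes step; the following is a minimal LaTeX sketch, assuming the textbook update on a finite partition (the symbols $L$ and $o_t$ are illustrative and not taken from the module):

```latex
% One filtering step on a finite partition \Omega, given observation o_t:
% posterior \propto likelihood \times prior, normalized by the evidence.
q_{t+1}(\omega) \;=\;
  \frac{L(o_t \mid \omega)\, q_t(\omega)}
       {\sum_{\omega' \in \Omega} L(o_t \mid \omega')\, q_t(\omega')}
% The certified property: each such step strictly decreases the VFE
%   F[q;p] = \mathbb{E}_q[E] + \mathrm{KL}[q \,\|\, p_{\mathrm{prior}}].
F[q_{t+1};p] \;<\; F[q_t;p]
```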

background

The module sits in the statistics domain and imports the VFE construction. Upstream, the Friston variational free energy is $F[q;p] = \mathbb{E}_q[E] + \mathrm{KL}[q \,\|\, p_{\mathrm{prior}}]$, equipped with monotone descent under ledger updates on a finite recognition partition. Local definitions introduce Likelihood as the observation model, evidence as the marginal likelihood, bayesStep as the single-step posterior update, and iterFilter as the recursive iteration over observation sequences, defined by nil and cons cases.
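The four local definitions named above could be sketched in Lean roughly as follows; this is a hedged reconstruction, and the module's actual types and normalization conventions may differ:

```lean
-- Assumes Mathlib for `Fintype`, `ℝ`, and the finite sum `∑ ω, …`.
-- All names below mirror the text; their signatures are assumptions.
variable {Ω Obs : Type} [Fintype Ω]

/-- Observation model: weight of observation `o` in state `ω`. -/
def Likelihood (Obs Ω : Type) : Type := Obs → Ω → ℝ

/-- Marginal likelihood (evidence) of `o` under the current belief `q`. -/
def evidence (L : Obs → Ω → ℝ) (q : Ω → ℝ) (o : Obs) : ℝ :=
  ∑ ω, L o ω * q ω

/-- Single Bayesian update: posterior ∝ likelihood × prior. -/
def bayesStep (L : Obs → Ω → ℝ) (q : Ω → ℝ) (o : Obs) : Ω → ℝ :=
  fun ω => L o ω * q ω / evidence L q o

/-- Iterated filter: fold `bayesStep` over an observation sequence,
    with the nil and cons cases mentioned in the text. -/
def iterFilter (L : Obs → Ω → ℝ) (q : Ω → ℝ) : List Obs → (Ω → ℝ)
  | []      => q
  | o :: os => iterFilter L (bayesStep L q o) os
```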

proof idea

This is a definition module. It introduces the component definitions in sequence, gives the nil and cons cases defining iterFilter, and closes with the certification theorem bayesianFilteringCert_holds, which asserts that the iterated procedure preserves the VFE descent property.
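The shape of that closing theorem might look as follows; this is a hedged sketch, not the module's actual statement, and it abstracts bayesStep, iterFilter, and F as hypotheses rather than fixing their definitions:

```lean
-- Hedged sketch: given one-step VFE descent under the ledger update
-- (`step`), the iterated filter inherits strict descent. All names and
-- hypotheses here are assumptions about the module's real statement.
theorem bayesianFilteringCert_holds
    {Ω Obs : Type}
    (F : (Ω → ℝ) → (Ω → ℝ) → ℝ)               -- variational free energy
    (bayesStep : (Ω → ℝ) → Obs → (Ω → ℝ))     -- single posterior update
    (iterFilter : (Ω → ℝ) → List Obs → (Ω → ℝ))
    (p : Ω → ℝ)                               -- prior / generative model
    (nil  : ∀ q, iterFilter q [] = q)
    (cons : ∀ q o os, iterFilter q (o :: os) = iterFilter (bayesStep q o) os)
    (step : ∀ q o, F (bayesStep q o) p < F q p)
    (q : Ω → ℝ) (o : Obs) (os : List Obs) :
    F (iterFilter (bayesStep q o) os) p < F q p := by
  sorry  -- would follow by induction on `os`, chaining `step` with lt_trans
```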

why it matters in Recognition Science

The module supplies the filtering layer that turns VFE descent into an explicit inference procedure on finite partitions. It realizes the statistical update step that sits between the RCL-derived VFE and higher-level recognition applications, though no downstream uses are recorded yet.

scope and limits

depends on (1)

Lean names referenced from this declaration's body.

declarations in this module (10)