pith. machine review for the scientific record.
theorem · proved · tactic proof · high

bayesStep_is_update


The declaration shows that the Bayesian step constructed from variational free energy equals the standard update formula: the prior multiplied by the likelihood and normalized by the evidence. Researchers working on discrete filtering, or on derivations of Bayesian methods from free-energy principles, would cite this to verify the equivalence. The proof introduces the outcome index and applies reflexivity, since the step definition already uses the normalized product.

claim

Let $P$ be a discrete probability distribution over a type $ι$, let $L$ be a likelihood structure providing a positive function from observations $Obs$ to $ι → ℝ$, and let $o$ be an observation. Then for every outcome $i ∈ ι$ the updated probability is given by $P(i) · L(o,i) / ∑_j P(j) · L(o,j)$.
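For concreteness, here is a worked instance of the claim (the numbers are illustrative, not taken from the module). Take two outcomes with prior $P = (0.5, 0.5)$ and likelihoods $L(o,1) = 0.9$, $L(o,2) = 0.3$:

  evidence:   ∑_j P(j) · L(o,j) = 0.5 · 0.9 + 0.5 · 0.3 = 0.6
  posterior:  P′(1) = 0.45 / 0.6 = 0.75,   P′(2) = 0.15 / 0.6 = 0.25

The posterior sums to one, as the normalization by the evidence guarantees.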

background

The module establishes that discrete Bayesian filtering arises as the one-step minimizer of variational free energy obtained from the Recognition Composition Law. A probability distribution consists of a vector of nonnegative reals summing to one. The likelihood structure supplies a positive real-valued function that maps each observation and outcome to a likelihood value. The evidence function computes the marginal probability of an observation as the sum over outcomes of the product of prior probability and likelihood. The Bayesian step then defines the posterior probabilities via the normalized product. Upstream, the probability distribution structure is imported from the Shannon entropy module.
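The structures described above might be sketched in Lean as follows. The names `ProbDist`, `Likelihood`, `prob`, `like`, and `evidence` appear in the formal statement on this page; the proof-field names (`nonneg`, `sum_one`, `pos`) and the exact typeclass assumptions are guesses for illustration, not the module's actual code.

  structure ProbDist (ι : Type) [Fintype ι] where
    prob    : ι → ℝ                  -- probability assigned to each outcome
    nonneg  : ∀ i, 0 ≤ prob i        -- field name assumed
    sum_one : ∑ i, prob i = 1        -- field name assumed

  structure Likelihood (ι Obs : Type) where
    like : Obs → ι → ℝ               -- likelihood of outcome i given observation o
    pos  : ∀ o i, 0 < like o i       -- positivity, as stated in the background

  -- marginal probability of the observation o under the prior P
  def evidence {ι Obs : Type} [Fintype ι]
      (P : ProbDist ι) (L : Likelihood ι Obs) (o : Obs) : ℝ :=
    ∑ i, P.prob i * L.like o i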

proof idea

The proof is a short tactic proof: introduce the index $i$, then close by reflexivity. This suffices because the definition of the Bayesian step directly assigns its probability component the expression prior probability times likelihood, divided by the evidence.
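Why reflexivity suffices: if `bayesStep` is defined by the normalized product, as the background states, then after `intro i` the goal is a definitional equality. A hedged sketch of such a definition (field names assumed; the proof obligations are omitted because only the `prob` component matters for this theorem):

  def bayesStep {ι Obs : Type} [Fintype ι]
      (P : ProbDist ι) (L : Likelihood ι Obs) (o : Obs) : ProbDist ι where
    prob := fun i => P.prob i * L.like o i / evidence P L o
    nonneg := sorry   -- omitted in this sketch
    sum_one := sorry  -- omitted in this sketch

Both sides of the goal then unfold to `P.prob i * L.like o i / evidence P L o`, and `rfl` closes it.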

why it matters in Recognition Science

This result is used by the theorem that certifies the Bayesian filtering property to confirm that the filtering procedure satisfies the Bayesian update property. It thereby links the variational free energy construction in the module to classical Bayesian inference. In the Recognition Science setting it shows how the update rule follows from the composition law without extra postulates, supporting the broader claim that filtering emerges from the J-cost on the recognition ladder.

scope and limits

formal statement (Lean)

  theorem bayesStep_is_update (prior : ProbDist ι) (L : Likelihood ι Obs) (o : Obs) :
      ∀ i, (bayesStep prior L o).prob i =
        prior.prob i * L.like o i / evidence prior L o := by

proof body

Tactic-mode proof.

    intro i
    rfl

used by (1)

From the project-wide theorem graph. These declarations reference this one in their bodies.

depends on (8)

Lean names referenced from this declaration's body.