pith. machine review for the scientific record.
theorem · proved · tactic proof · high

washburn_uniqueness_aczel


Washburn's uniqueness theorem shows that any continuous reciprocal cost on the positive reals obeying the Recognition Composition Law, normalization at unity, and unit calibration must equal the J-cost. Researchers citing T5 in the forcing chain, or the logical derivations of cost, invoke this result. The proof converts the composition law into a d'Alembert equation on the shifted function $H$, applies the Aczél smoothness lemma to identify the cosh solution, and matches it to Jcost via the exponential substitution.

claim

Let $F : (0,∞) → ℝ$ satisfy $F(x) = F(x^{-1})$ for all $x > 0$, $F(1) = 0$, the Recognition Composition Law $F(xy) + F(x/y) = 2F(x)F(y) + 2F(x) + 2F(y)$ for all $x,y > 0$, the calibration condition that the second derivative of $t ↦ F(e^t)$ at $t=0$ equals 1, and continuity on $(0,∞)$. Then $F(x) = J(x)$ for all $x > 0$, where $J(x) = (x + x^{-1})/2 - 1$.
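As a sanity check (not part of the Lean development), one can verify by hand that $J$ itself satisfies the composition law; substituting $x = e^t$, $y = e^u$, so that $J(e^s) = \cosh s - 1$, reduces it to the product formula for $\cosh$:

```latex
\begin{aligned}
J(xy) + J(x/y)
  &= \bigl(\cosh(t+u) - 1\bigr) + \bigl(\cosh(t-u) - 1\bigr) \\
  &= 2\cosh t \cosh u - 2 \\
  &= 2(\cosh t - 1)(\cosh u - 1) + 2(\cosh t - 1) + 2(\cosh u - 1) \\
  &= 2\,J(x)J(y) + 2\,J(x) + 2\,J(y).
\end{aligned}
```

Reciprocity ($J(x^{-1}) = J(x)$), normalization ($J(1) = 0$), and calibration ($\tfrac{d^2}{dt^2}(\cosh t - 1)\big|_{t=0} = 1$) follow immediately from the same substitution.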

background

The module supplies lemmas for the T5 cost uniqueness proof in the Recognition Science forcing chain. Reciprocal cost requires $F(x) = F(1/x)$ for $x > 0$; normalized means $F(1) = 0$; the composition law is the Recognition Composition Law (RCL) on positive ratios; calibrated is the condition $G''(0) = 1$ where $G(t) = F(e^t)$. The Aczél smoothness package guarantees that continuous solutions of the d'Alembert equation $H(t+u) + H(t-u) = 2H(t)H(u)$ with $H(0)=1$ are $C^∞$. Upstream, the composition law is equivalent to the cosh-add identity on $G$, and $H$ is the shift $J + 1$ that turns the RCL into standard d'Alembert form.
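The shift mentioned in the last sentence is a one-line computation: if $G$ satisfies the cosh-add identity $G(t+u) + G(t-u) = 2G(t)G(u) + 2G(t) + 2G(u)$, then $H = G + 1$ satisfies the d'Alembert equation:

```latex
\begin{aligned}
H(t+u) + H(t-u)
  &= G(t+u) + G(t-u) + 2 \\
  &= 2G(t)G(u) + 2G(t) + 2G(u) + 2 \\
  &= 2\bigl(G(t) + 1\bigr)\bigl(G(u) + 1\bigr) \\
  &= 2\,H(t)H(u).
\end{aligned}
```

This is exactly the algebra carried out by the `calc` block in the proof body below.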

proof idea

Fix $x > 0$. Reciprocity gives the symmetry $F(y) = F(y^{-1})$. The equivalence lemma converts the composition law into CoshAddIdentity on $G_F$. Define $G_F(t) = F(e^t)$ and $H_F = G_F + 1$; record $G_F(0) = 0$ and $H_F(0) = 1$ from normalization, obtain continuity of both transforms, derive the direct cosh-add identity on $G_F$, and obtain the d'Alembert equation on $H_F$. Calibration supplies the second derivative of $H_F$ at zero equal to 1. The dAlembert_cosh_solution_aczel lemma then forces $H_F(t) = \cosh(t)$, so $G_F(t) = \cosh(t) - 1$. The change of variables $x = e^{\log x}$ matches this to the known expression for Jcost.
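Spelled out, the final change of variables is: for $x > 0$ put $t = \log x$, so that

```latex
F(x) = F\!\left(e^{\log x}\right) = G_F(\log x) = \cosh(\log x) - 1
     = \frac{e^{\log x} + e^{-\log x}}{2} - 1 = \frac{x + x^{-1}}{2} - 1 = J(x).
```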

why it matters in Recognition Science

This supplies the clean T5 uniqueness statement in the forcing chain, feeding directly into cost_algebra_unique_aczel and the logical formalization results J_is_unique_cost_under_logic and RCL_is_unique_functional_form_of_logic. It anchors the derivation of the phi-ladder and alpha band from the single functional equation by internalizing the Aczél axiom and removing extra regularity hypotheses. The result thereby closes the uniqueness argument for the canonical reciprocal cost under the Recognition Composition Law.

scope and limits

formal statement (Lean)

theorem washburn_uniqueness_aczel (F : ℝ → ℝ)
    [AczelSmoothnessPackage]
    (hRecip : IsReciprocalCost F)
    (hNorm : IsNormalized F)
    (hComp : SatisfiesCompositionLaw F)
    (hCalib : IsCalibrated F)
    (hCont : ContinuousOn F (Set.Ioi 0)) :
    ∀ x : ℝ, 0 < x → F x = Cost.Jcost x := by

proof body

Tactic-mode proof.

  intro x hx
  have hSymm : ∀ {y}, 0 < y → F y = F y⁻¹ := fun {y} hy => hRecip y hy
  have hCoshAdd : CoshAddIdentity F := composition_law_equiv_coshAdd F |>.mp hComp
  let Gf : ℝ → ℝ := G F
  let Hf : ℝ → ℝ := H F
  have h_G0 : Gf 0 = 0 := G_zero_of_unit F hNorm
  have h_H0 : Hf 0 = 1 := by
    show H F 0 = 1
    simp only [H, G, Real.exp_zero]
    rw [hNorm]; ring
  have h_G_cont : Continuous Gf := by
    have h := ContinuousOn.comp_continuous hCont continuous_exp
    have h' : Continuous (fun t => F (Real.exp t)) :=
      h (by intro t; exact Set.mem_Ioi.mpr (Real.exp_pos t))
    simp [Gf, G] at h'
    exact h'
  have h_H_cont : Continuous Hf := by
    simpa [Hf, H] using h_G_cont.add continuous_const
  have h_direct : DirectCoshAdd Gf := CoshAddIdentity_implies_DirectCoshAdd F hCoshAdd
  have h_dAlembert : ∀ t u, Hf (t + u) + Hf (t - u) = 2 * Hf t * Hf u := by
    intro t u
    have hG := h_direct t u
    have h_goal : (Gf (t + u) + 1) + (Gf (t - u) + 1) = 2 * (Gf t + 1) * (Gf u + 1) := by
      calc (Gf (t + u) + 1) + (Gf (t - u) + 1)
          = (Gf (t + u) + Gf (t - u)) + 2 := by ring
        _ = (2 * (Gf t * Gf u) + 2 * (Gf t + Gf u)) + 2 := by simp [hG]
        _ = 2 * (Gf t + 1) * (Gf u + 1) := by ring
    simp [Hf, H, Gf] at h_goal
    exact h_goal
  have h_H_d2 : deriv (deriv Hf) 0 = 1 := by
    have hG_d2 : deriv (deriv Gf) 0 = 1 := by simpa [Gf, G] using hCalib
    have hderiv : deriv Hf = deriv Gf := by
      funext t; change deriv (fun y => Gf y + 1) t = deriv Gf t
      exact (deriv_add_const (f := Gf) (x := t) (c := (1 : ℝ)))
    have hderiv2 : deriv (deriv Hf) = deriv (deriv Gf) := congrArg deriv hderiv
    exact (congrArg (fun g => g 0) hderiv2).trans hG_d2
  have h_H_cosh : ∀ t, Hf t = Real.cosh t :=
    dAlembert_cosh_solution_aczel Hf h_H0 h_H_cont h_dAlembert h_H_d2
  have h_G_cosh : ∀ t, Gf t = Real.cosh t - 1 := fun t => by
    have : Gf t + 1 = Real.cosh t := h_H_cosh t
    linarith
  have ht : Real.exp (Real.log x) = x := Real.exp_log hx
  have hJG : G Cost.Jcost (Real.log x) = Real.cosh (Real.log x) - 1 :=
    Jcost_G_eq_cosh_sub_one (Real.log x)
  calc F x
      = F (Real.exp (Real.log x)) := by rw [ht]
    _ = Gf (Real.log x) := rfl
    _ = Real.cosh (Real.log x) - 1 := h_G_cosh (Real.log x)
    _ = G Cost.Jcost (Real.log x) := by simp only [hJG]
    _ = Cost.Jcost (Real.exp (Real.log x)) := by simp [G]
    _ = Cost.Jcost x := by simp [ht]

end FunctionalEquation
end Cost
end IndisputableMonolith

used by (7)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (25)

Lean names referenced from this declaration's body.