pith. machine review for the scientific record.
theorem · proved · tactic proof · high

hyperbolic_ode_unique


The uniqueness theorem for the hyperbolic ODE asserts that any C² function G satisfying G''(t) = G(t) + 1 together with G(0) = G'(0) = 0 must equal cosh(t) - 1. Analysts working on the d'Alembert bridge in Recognition Science cite it to identify the log-reparametrized interaction function with the J-cost. The proof reduces the inhomogeneous equation to the homogeneous oscillator via a shift by cosh, verifies the zero initial data, and applies the zero-uniqueness lemma for f'' = f.
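Written out, the shift-by-cosh reduction described above is a one-line computation (a sketch of the informal argument, not part of the formal text):

```latex
f(t) := G(t) + 1 - \cosh t
\;\Longrightarrow\;
f''(t) = G''(t) - \cosh t = \bigl(G(t) + 1\bigr) - \cosh t = f(t),
```

with $f(0) = G(0) + 1 - \cosh 0 = 0$ and $f'(0) = G'(0) - \sinh 0 = 0$, so the zero-uniqueness lemma for $f'' = f$ gives $f \equiv 0$, i.e. $G(t) = \cosh t - 1$.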

claim

The unique twice continuously differentiable function $G : ℝ → ℝ$ satisfying the hyperbolic ordinary differential equation $G''(t) = G(t) + 1$ for all $t$, together with the initial conditions $G(0) = 0$ and $G'(0) = 0$, is $G(t) = \cosh t - 1$.

background

In the Analytic Bridge module the log-lift $G(t) = F(e^t)$ converts the multiplicative consistency equation into an additive functional equation whose second derivative yields the hyperbolic ODE. The predicate SatisfiesHyperbolicODE G encodes the condition deriv (deriv G) t = G t + 1 for every t. The upstream lemma ode_zero_uniqueness states that the unique C² solution to $f'' = f$ with $f(0) = f'(0) = 0$ is $f = 0$. This local setting sits inside the larger strategy of showing that interaction forces the d'Alembert structure rather than the additive one.
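For orientation, the upstream lemma plausibly has the following shape; the hypothesis order is inferred from the application Cost.ode_zero_uniqueness f hf_smooth hf_ode hf0 hfd0 in the proof body, and the exact statement lives in the upstream module:

```lean
-- Assumed shape of the upstream lemma (hypothesis order inferred from
-- its use in this proof; the actual statement is in Cost.FunctionalEquation):
theorem ode_zero_uniqueness (f : ℝ → ℝ)
    (hf : ContDiff ℝ 2 f)
    (hODE : ∀ x, deriv (deriv f) x = f x)
    (h0 : f 0 = 0) (h0' : deriv f 0 = 0) :
    ∀ t, f t = 0 := by
  sorry
```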

proof idea

The tactic proof introduces G, assumes the ODE, zero initial data and C² smoothness. It defines y(t) := G(t) + 1 and f(t) := y(t) - cosh(t), establishes f(0) = 0 and f'(0) = 0 by direct computation, and derives the homogeneous ODE f'' = f from the given hyperbolic condition. An energy E(t) := (f'(t))² - (f(t))² is shown to have vanishing derivative and therefore to be constant; since E(0) = 0 it vanishes everywhere. The proof then invokes Cost.FunctionalEquation.ode_zero_uniqueness on f to conclude f ≡ 0, which immediately yields the claimed form for G.
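The energy step in the middle of the argument is one line of calculus; given $f'' = f$,

```latex
E(t) = f'(t)^2 - f(t)^2, \qquad
E'(t) = 2\,f'(t)\,f''(t) - 2\,f(t)\,f'(t)
      = 2\,f'(t)\bigl(f''(t) - f(t)\bigr) = 0,
```

so $E$ is constant, and $E(0) = f'(0)^2 - f(0)^2 = 0$ forces $E \equiv 0$ everywhere.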

why it matters in Recognition Science

This result supplies the explicit solution needed in the chain F_forced_to_Jcost, where hyperbolic ODE plus boundary conditions imply G = cosh - 1 and therefore F(x) = Jcost(x). It also supports the converse verification Jcost_satisfies_dAlembert. Within the Recognition framework it realizes the T5 J-uniqueness step by showing that the forced hyperbolic solution coincides with the self-similar fixed point J(x) = (x + x^{-1})/2 - 1. The declaration closes one link in the forcing chain from interaction to the explicit J-cost form.
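The identification with the J-cost is the substitution $x = e^t$: since $G(t) = F(e^t)$,

```latex
F(x) = G(\ln x) = \cosh(\ln x) - 1
     = \frac{e^{\ln x} + e^{-\ln x}}{2} - 1
     = \frac{x + x^{-1}}{2} - 1 = J(x), \qquad x > 0,
```

which is exactly the self-similar fixed-point form quoted above.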

scope and limits

formal statement (Lean)

theorem hyperbolic_ode_unique :
    ∀ G : ℝ → ℝ,
    SatisfiesHyperbolicODE G →
    G 0 = 0 →
    deriv G 0 = 0 →
    ContDiff ℝ 2 G →
    ∀ t, G t = Real.cosh t - 1 := by

proof body

Tactic-mode proof.

  intro G hODE hG0 hGderiv0 hSmooth t
  let y := fun x => G x + 1
  let f := fun x => y x - Real.cosh x
  -- We want to show f x = 0
  have hf0 : f 0 = 0 := by simp [f, y, hG0]
  have hfd0 : deriv f 0 = 0 := by
    unfold f y
    rw [deriv_sub, deriv_add_const, hGderiv0, Real.deriv_cosh, Real.sinh_zero, sub_zero]
    · exact hSmooth.differentiable (by decide) |>.differentiableAt
    · exact Real.differentiable_cosh.differentiableAt
  have hf_ode : ∀ x, deriv (deriv f) x = f x := by
    intro x
    unfold f y
    rw [deriv_sub, deriv_add_const, Real.deriv_cosh]
    rw [deriv_sub, deriv_add_const, Real.deriv_sinh, hODE]
    · ring
    · exact hSmooth.iterate_deriv 1 1 |>.differentiable (by decide) |>.differentiableAt
    · exact Real.differentiable_sinh.differentiableAt
    · exact hSmooth.differentiable (by decide) |>.differentiableAt
    · exact Real.differentiable_cosh.differentiableAt
  -- Differentiability of f
  have hf_diff : Differentiable ℝ f := by
    apply (hSmooth.add contDiff_const).sub Real.contDiff_cosh |>.differentiable (by decide)

  have hf'_diff : Differentiable ℝ (deriv f) := by
    apply (hSmooth.add contDiff_const).sub Real.contDiff_cosh |>.iterate_deriv 1 1 |>.differentiable (by decide)

  -- Energy method: E(x) = (f' x)^2 - (f x)^2
  let E := fun x => (deriv f x)^2 - (f x)^2
  have hEd : ∀ x, deriv E x = 0 := by
    intro x
    unfold E
    rw [deriv_sub, deriv_pow, deriv_pow]
    · rw [hf_ode]; ring
    · exact hf'_diff.differentiableAt
    · exact hf_diff.differentiableAt
    · exact DifferentiableAt.pow hf'_diff.differentiableAt 2
    · exact DifferentiableAt.pow hf_diff.differentiableAt 2

  -- E is constant: its derivative vanishes everywhere
  have hE_diff : Differentiable ℝ E := by
    apply Differentiable.sub (hf'_diff.pow 2) (hf_diff.pow 2)

  have hE_const : ∀ x y, E x = E y := fun x y =>
    is_const_of_deriv_eq_zero hE_diff hEd x y

  -- E(0) = (f'(0))² - (f(0))² = 0² - 0² = 0
  have hE0 : E 0 = 0 := by simp [E, hf0, hfd0]

  -- Therefore E = 0 everywhere: (f'(x))² = (f(x))²
  have hE_zero : ∀ x, E x = 0 := by
    intro x
    rw [hE_const x 0, hE0]

  -- Now use ode_zero_uniqueness: f'' = f with f(0) = 0, f'(0) = 0 implies f = 0
  have hf_smooth : ContDiff ℝ 2 f := by
    apply (hSmooth.add contDiff_const).sub Real.contDiff_cosh

  have hf_is_zero := Cost.ode_zero_uniqueness f hf_smooth hf_ode hf0 hfd0

  -- Therefore G(t) = cosh(t) - 1
  have hft : f t = 0 := hf_is_zero t
  simp only [f, y, sub_eq_zero] at hft
  linarith

used by (2)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (23)

Lean names referenced from this declaration's body.