pith. machine review for the scientific record.
theorem · proved · tactic proof · high

even_deriv_at_zero


Even functions from the reals to the reals that are differentiable at zero have vanishing first derivative there. Workers on the d'Alembert equation in the Recognition Science cost model cite the result to confirm the first-order symmetry condition before classifying solutions. The argument composes the function with negation, applies the chain rule to the composition, and cancels the resulting expressions via algebraic simplification.

claim

If $H : ℝ → ℝ$ is even and differentiable at $0$, then $H'(0) = 0$.

background

The Cost.FunctionalEquation module supplies helper lemmas for the T5 cost uniqueness proof. The identifier H denotes the shifted cost reparametrization H_F t = G_F t + 1; under this shift the Recognition Composition Law becomes the standard d'Alembert equation H(t+u) + H(t-u) = 2 H(t) H(u). Upstream results such as h_even in JCostConvexityInLogSpace establish the required evenness for the underlying J-cost in log space, while CostAlgebra.H records the explicit form H(x) = J(x) + 1 = ½(x + x⁻¹).
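
As a standalone illustration, not taken from the project sources, the cosh form underlying this explicit H does satisfy the d'Alembert equation; the sketch below checks it using only Mathlib's addition formulas (assuming Mathlib is imported):

-- Minimal sketch (not project code): cosh satisfies H(t+u) + H(t-u) = 2 H(t) H(u).
example (t u : ℝ) :
    Real.cosh (t + u) + Real.cosh (t - u) = 2 * Real.cosh t * Real.cosh u := by
  rw [Real.cosh_add, Real.cosh_sub]
  ring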

proof idea

The proof introduces the negation map negFun x = -x. Evenness yields H ∘ negFun = H, so the derivatives at zero coincide. The chain rule then gives deriv(H ∘ negFun) 0 = deriv H 0 · (-1). Equating the two expressions for the derivative and solving the linear relation produces the claim. The steps invoke differentiability of negation and the chain-rule lemma deriv_comp.
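
Spelled out, writing $n(x) = -x$ for negFun, the two computations combine as $H'(0) = (H \circ n)'(0) = H'(n(0)) \cdot n'(0) = H'(0) \cdot (-1) = -H'(0)$, so $2H'(0) = 0$ and hence $H'(0) = 0$.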

why it matters in Recognition Science

The lemma is used by dAlembert_classification, dAlembert_cosh_solution, and dAlembert_cosh_solution_aczel. It supplies the vanishing first derivative at zero that, together with the second-derivative normalization H''(0) = 1, selects the cosh solution among the Aczél-classified candidates. In the framework it supports the identification of the J-cost with cosh(log x) - 1 at T5 of the forcing chain.
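
A routine check, not taken from the formalization: for $H(t) = \cosh t$ one has $H'(0) = \sinh 0 = 0$ and $H''(0) = \cosh 0 = 1$, and since $\cosh(\log x) = \tfrac{1}{2}(x + x^{-1})$, the identification $J(x) = \cosh(\log x) - 1$ agrees with the explicit form $J(x) + 1 = \tfrac{1}{2}(x + x^{-1})$ recorded by CostAlgebra.H.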

scope and limits

The hypotheses are minimal: H is a real-valued function on ℝ, assumed even everywhere but differentiable only at the single point 0; no continuity, global differentiability, or higher-order regularity is required, and the conclusion concerns the first derivative at 0 only.

formal statement (Lean)

theorem even_deriv_at_zero (H : ℝ → ℝ)
  (h_even : Function.Even H) (h_diff : DifferentiableAt ℝ H 0) : deriv H 0 = 0 := by

proof body

Tactic-mode proof.

  -- For even functions, the derivative at 0 is 0
  let negFun : ℝ → ℝ := fun x => -x
  have h1 : deriv H 0 = deriv (H ∘ negFun) 0 := by
    congr 1
    ext x
    simp only [Function.comp_apply, negFun]
    exact (h_even x).symm
  have h2 : deriv (H ∘ negFun) 0 = -deriv H 0 := by
    have hd : DifferentiableAt ℝ negFun 0 := differentiable_neg.differentiableAt
    have h_diff_neg : DifferentiableAt ℝ H (negFun 0) := by simp [negFun]; exact h_diff
    have hchain := deriv_comp (x := (0 : ℝ)) h_diff_neg hd
    rw [hchain]
    simp only [negFun, neg_zero]
    have hdn : deriv negFun 0 = -1 := congrFun deriv_neg' 0
    rw [hdn]
    ring
  rw [h1] at h2
  linarith
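
A hypothetical usage sketch, not part of the project sources: assuming Mathlib is imported, this declaration is in scope, and Function.Even H unfolds to ∀ x, H (-x) = H x (the form the proof above relies on), the lemma specializes directly to the hyperbolic cosine.

-- Hypothetical example; Real.cosh_neg and Real.differentiable_cosh are Mathlib lemmas,
-- and the Function.Even form is assumed as described above.
example : deriv Real.cosh 0 = 0 :=
  even_deriv_at_zero Real.cosh
    (fun x => Real.cosh_neg x)                 -- evenness: cosh (-x) = cosh x
    Real.differentiable_cosh.differentiableAt  -- differentiability at 0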

used by (8)

From the project-wide theorem graph. These declarations reference this one in their body.

depends on (8)

Lean names referenced from this declaration's body.