hasDerivAt_costAlphaLog
Researchers working on the WLOG α=1 reduction in the Recognition Science cost functions cite this lemma to obtain the explicit derivative of the rescaled cost in logarithmic coordinates. It asserts that G_α(t) = (1/α²)(cosh(α t) - 1) is differentiable at every real t (α ≠ 0) with derivative sinh(α t)/α. The proof is a short tactic-mode calculation that composes the derivative of cosh with the linear map s ↦ α s, subtracts the constant 1, multiplies by the prefactor 1/α², and simplifies the resulting expression with zero_mul and zero_add.
claim
Let $G_α(t) = (1/α²)(cosh(α t) - 1)$ for fixed $α ≠ 0$. Then $G_α$ is differentiable at every real $t$ with derivative $sinh(α t)/α$.
background
The module WLOGAlphaOne reparametrizes the calibrated cost family in logarithmic coordinates. After the calibration κ(F)=1 forces c=2α², the α-parameterized cost becomes G_α(t)=(1/α²)(cosh(α t)-1), the image of the multiplicative form F_α(x)=(1/α²)(cosh(α ln x)-1) under t=ln x. The module shows that the map x ↦ x^α is a group automorphism of (ℝ₊,·), so α merely rescales the multiplicative coordinate and the unit-curvature condition holds uniformly; this justifies the reduction to α=1 without loss of generality.
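The change of variables described above can be written out explicitly. The following derivation is illustrative, restating the definitions F_α(x) = (1/α²)(cosh(α ln x) - 1) and t = ln x given in this section:

```latex
G_\alpha(t) \;=\; F_\alpha(e^{t})
\;=\; \frac{1}{\alpha^{2}}\bigl(\cosh(\alpha \ln e^{t}) - 1\bigr)
\;=\; \frac{1}{\alpha^{2}}\bigl(\cosh(\alpha t) - 1\bigr).
```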
proof idea
The proof applies the chain rule to obtain the derivative of cosh(α s) at s=t by composing hasDerivAt_cosh with hasDerivAt_alpha_mul. It subtracts the constant 1, multiplies the result by the constant function 1/α² (whose derivative is zero), and simplifies the product using zero_mul and zero_add. Unfolding the definition of CostAlphaLog and converting the equality yields the stated derivative.
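Written out as an ordinary calculus computation, the steps above amount to the chain rule followed by the product rule against a constant; the hypothesis α ≠ 0 is needed only for the final cancellation:

```latex
\frac{d}{dt}\cosh(\alpha t) = \sinh(\alpha t)\cdot\alpha,
\qquad
\frac{d}{dt}\Bigl[\tfrac{1}{\alpha^{2}}\bigl(\cosh(\alpha t)-1\bigr)\Bigr]
= 0\cdot\bigl(\cosh(\alpha t)-1\bigr) + \frac{1}{\alpha^{2}}\cdot\sinh(\alpha t)\,\alpha
= \frac{\sinh(\alpha t)}{\alpha}.
```

The middle expression mirrors the Lean term h_const.mul h2, whose zero summand is discharged by zero_mul and zero_add before field_simp cancels one factor of α.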
why it matters in Recognition Science
This lemma is invoked directly by the sibling result deriv_costAlphaLog_eq to extract the explicit derivative function. It fills the analytic step of the module's coordinate-rescaling proposition, confirming that the derivative form is compatible with the J-cost representation J(x)=(x+x⁻¹)/2-1 after the change of variables. In the broader framework it supports the T5 J-uniqueness and T6 phi fixed-point steps by supplying the concrete derivative needed for the eight-tick octave and D=3 constructions.
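As a consistency check (not itself part of the lemma), at α = 1 the logarithmic cost recovers the J-cost representation under t = ln x, using cosh(ln x) = (x + x⁻¹)/2:

```latex
G_{1}(\ln x) \;=\; \cosh(\ln x) - 1 \;=\; \frac{x + x^{-1}}{2} - 1 \;=\; J(x).
```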
scope and limits
- Does not establish differentiability when α=0.
- Does not compute second or higher derivatives.
- Does not translate the derivative back to the multiplicative coordinate x.
- Does not invoke the Recognition Composition Law or phi-ladder mass formula.
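For orientation only, since this step is explicitly outside the lemma's scope: translating the derivative back to the multiplicative coordinate would combine the result with d(ln x)/dx = 1/x via the chain rule,

```latex
\frac{d}{dx}\,F_\alpha(x) \;=\; G_\alpha'(\ln x)\cdot\frac{1}{x}
\;=\; \frac{\sinh(\alpha \ln x)}{\alpha x}, \qquad x > 0.
```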
formal statement (Lean)
private lemma hasDerivAt_costAlphaLog (α : ℝ) (hα : α ≠ 0) (t : ℝ) :
    HasDerivAt (CostAlphaLog α) (sinh (α * t) / α) t := by
proof body
Tactic-mode proof.
  have h1 : HasDerivAt (fun s => cosh (α * s)) (sinh (α * t) * α) t :=
    (hasDerivAt_cosh (α * t)).comp t (hasDerivAt_alpha_mul α t)
  have h2 : HasDerivAt (fun s => cosh (α * s) - 1) (sinh (α * t) * α) t :=
    h1.sub_const 1
  have h_const : HasDerivAt (fun _ : ℝ => (1 / α ^ 2 : ℝ)) 0 t :=
    hasDerivAt_const t (1 / α ^ 2)
  have h3 := h_const.mul h2
  simp only [zero_mul, zero_add] at h3
  unfold CostAlphaLog
  convert h3 using 1
  field_simp