IndisputableMonolith.Complexity.JCostLaplacian
The JCostLaplacian module supplies the discrete graph structure and edge weights for the J-cost landscape on Boolean assignments. It defines bit-flip operations and edge weights that model local changes in the Recognition Science operator R̂ for SAT instances. Complexity theorists analyzing spectral gaps or J-frustration cite these definitions to bound circuit size via topological obstructions. The module contains only definitions and elementary lemmas proved by direct computation; it establishes no deep results.
claim
Let $A$ be an assignment in $\{0,1\}^n$. Define the neighbor map by single-bit flip: $\mathrm{flip}(A,i)$ inverts the $i$-th coordinate. The edge weight between $A$ and $\mathrm{flip}(A,i)$ is $\mathrm{jcostEdgeWeight}(A,i) = |J(A) - J(\mathrm{flip}(A,i))|$; the absolute value makes the weight nonnegative and symmetric in the two endpoints. The J-cost Laplacian is the weighted graph operator whose quadratic form sums these weights against squared differences of a test function over all edges.
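A minimal Lean sketch of the two definitions in the claim, assuming assignments are modeled as `Fin n → Bool` and the cost function `J` is passed as a parameter (the actual module obtains `J` from RSatEncoding; all signatures here are assumptions):

```lean
import Mathlib

-- Hypothetical signatures: the real module's types come from RSatEncoding.
variable {n : ℕ}

/-- Invert the i-th coordinate of a Boolean assignment. -/
def flipBit (A : Fin n → Bool) (i : Fin n) : Fin n → Bool :=
  fun j => if j = i then !A j else A j

/-- Weight on the hypercube edge {A, flipBit A i}: the absolute J-cost
    difference, nonnegative and symmetric in the two endpoints. -/
noncomputable def jcostEdgeWeight (J : (Fin n → Bool) → ℝ)
    (A : Fin n → Bool) (i : Fin n) : ℝ :=
  |J A - J (flipBit A i)|
```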
background
The module sits inside the Complexity domain and imports RSatEncoding (which encodes SAT formulas with the J-cost function derived from the Recognition Composition Law) together with the base time quantum from Constants. Key objects include the flipBit operation that inverts one coordinate of an assignment, the jcostEdgeWeight that assigns a nonnegative symmetric cost to each hypercube edge, and auxiliary predicates such as containsVar and varDegree that track variable occurrence. These build the weighted graph on which the J-cost landscape is defined, with vertices as assignments and edges corresponding to single-bit changes.
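The auxiliary predicates can be sketched as follows, under the assumption that a clause is a list of literals `(Fin n × Bool)` (variable index and polarity) and a formula is a list of clauses; RSatEncoding's actual representations may differ:

```lean
import Mathlib

variable {n : ℕ}

/-- A clause mentions variable i if some literal carries index i
    (the clause representation is an assumption of this sketch). -/
def containsVar (c : List (Fin n × Bool)) (i : Fin n) : Bool :=
  c.any (fun l => l.1 == i)

/-- Number of clauses of F that mention variable i. -/
def varDegree (F : List (List (Fin n × Bool))) (i : Fin n) : ℕ :=
  (F.filter (fun c => containsVar c i)).length
```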
proof idea
This is a definition module; the only proofs are elementary. It introduces the flipBit map, proves its basic involution and symmetry properties by direct computation, defines jcostEdgeWeight via the absolute J-cost difference, and records nonnegativity and symmetry as immediate consequences of the underlying J properties imported from RSatEncoding.
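The "direct computation" style can be illustrated on the involution lemma. The statement mirrors flipBit_flipBit, with assignments modeled as `Fin n → Bool` (an assumption of this sketch, not necessarily the module's exact encoding):

```lean
import Mathlib

def flipBit {n : ℕ} (A : Fin n → Bool) (i : Fin n) : Fin n → Bool :=
  fun j => if j = i then !A j else A j

/-- Flipping the same bit twice restores the assignment; the proof is a
    pointwise case split, i.e. the direct computation the module relies on. -/
theorem flipBit_flipBit {n : ℕ} (A : Fin n → Bool) (i : Fin n) :
    flipBit (flipBit A i) i = A := by
  funext j
  by_cases h : j = i <;> simp [flipBit, h]
```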
why it matters in Recognition Science
The definitions feed directly into SpectralGap (where the Laplacian controls convergence rate of R̂ gradient descent), JFrustration (which measures topological depth of the J-cost barrier), CircuitLowerBound (high frustration implies superpolynomial circuit size), NonNaturalness (J-frustration evades natural-proof barriers), and PvsNPAssembly (the conditional resolution path for P ≠ NP). It supplies the concrete graph model required for the eight-tick octave and phi-ladder analysis of complexity in the Recognition Science framework.
scope and limits
- Does not prove spectral gap bounds or eigenvalue estimates.
- Does not connect J-cost weights to physical constants or the phi ladder.
- Does not treat continuous or infinite-dimensional limits of the Laplacian.
- Does not encode or solve specific SAT instances.
used by (6)
depends on (2)
declarations in this module (23)
- def flipBit
- theorem flipBit_flipBit
- theorem flipBit_ne
- def jcostEdgeWeight
- theorem jcostEdgeWeight_nonneg
- theorem jcostEdgeWeight_symm
- theorem jcostEdgeWeight_le_clauses
- def containsVar
- def varDegree
- theorem flipBit_other
- theorem literal_unchanged_by_flip
- theorem clause_unchanged_by_flip
- def jcostEdgeWeight_le_varDegree_prop
- def jcostDegree
- theorem jcostDegree_nonneg
- def JCostLaplacianForm
- theorem laplacian_form_nonneg
- theorem laplacian_form_const_zero
- theorem laplacian_form_zero_imp
- inductive CostConnected
- theorem costConnected_refl
- structure JCostLaplacianCert
- def jcostLaplacianCert
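The quadratic form named JCostLaplacianForm can be sketched as a double sum over vertices and flip directions; each undirected edge is then visited from both endpoints, so this convention carries a factor of two relative to a sum over edges. All signatures below are assumptions of the sketch:

```lean
import Mathlib

variable {n : ℕ}

def flipBit (A : Fin n → Bool) (i : Fin n) : Fin n → Bool :=
  fun j => if j = i then !A j else A j

/-- Sketch of the Laplacian quadratic form: weighted squared differences of a
    test function f over all hypercube edges, with weight the absolute
    J-cost difference across the edge. -/
noncomputable def JCostLaplacianForm (J f : (Fin n → Bool) → ℝ) : ℝ :=
  ∑ A : Fin n → Bool, ∑ i : Fin n,
    |J A - J (flipBit A i)| * (f A - f (flipBit A i)) ^ 2
```

Nonnegativity of this form (laplacian_form_nonneg in the list) is immediate, since every summand is a product of an absolute value and a square.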