
coincidence — Roadmap

The coincidence method as a content-addressing primitive. Walther Bothe (1924): N independent observers, same event, different measurement records. The shared invariant — what every observer agrees on — is the content address.

See also: ../fragmentation/ROADMAP.md — the content-addressed tree that coincidence operates on.


What's Built

  • Rank-1 projection matrices, idempotent (P^2 = P)
  • N-projection coincidence gate with AND semantics
  • Geometric eigenvalue — no SHA-256 in measurement path
  • Key → Projection (Projection::from_seed())
  • Session from Keys (Session::from_fingerprints())
  • Key agreement (Agreement — eigenvalue → HKDF-SHA256 → 32-byte symmetric key)
  • Anchored detection — observation history in the hash
  • Detection vocabulary — Eigenvalue, Focus, Magnitude, Detection, Measurement, Outcome
  • Symmetric encryption — seal/open with ChaCha20-Poly1305 AEAD
  • End-to-end pipeline — fingerprints → session → detection → agreement → seal → open
  • Spectral — graph Laplacian (Dirac operator D), Jacobi eigenvalues, spectral distance
  • Commutator [D, P] — encoding map E (n×d), lift E·P·Eᵀ, Frobenius norm
  • Detection strength — continuous confidence in [0,1], min magnitude ratio
  • Iterative detection — power method with anchor feedback, measurement convergence
  • Trajectory — detection sequence during iteration, drift, stable_from, convergence
  • Connes distance — observer-relative geometric distance, sup |&lt;psi|P|psi&gt; - &lt;phi|P|phi&gt;| / ||[D,P]||
  • Crystallize — spectral fingerprint: eigenvalues + commutator norms, L2 distance, isospectral detection
  • Crystal store integration — Encode/Decode, reconstruction constructors, content OID round-trip
  • Incidence matrix — oriented B, L=BBᵀ, cycle space (ker B), conservation laws (ker Bᵀ)
  • Edge decomposition — per-edge commutator norms via [L,P] = Σ_k [b_k·b_kᵀ, P], bridges incidence ↔ commutator
  • 343 tests
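The idempotence claim in the first bullet is quick to check numerically. A minimal sketch in plain Rust, assuming the seed vector is supplied directly (the crate's Projection::from_seed() derives the state vector from key material first):

```rust
// Rank-1 projection: P = v·vᵀ / ⟨v, v⟩. Sketch only; the real crate
// builds the vector from a seed, here we take it as a parameter.

fn rank1_projection(v: &[f64]) -> Vec<Vec<f64>> {
    let norm2: f64 = v.iter().map(|x| x * x).sum();
    v.iter()
        .map(|&vi| v.iter().map(|&vj| vi * vj / norm2).collect())
        .collect()
}

fn matmul(a: &[Vec<f64>], b: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let n = a.len();
    (0..n)
        .map(|i| (0..n).map(|j| (0..n).map(|k| a[i][k] * b[k][j]).sum()).collect())
        .collect()
}

fn max_abs_diff(a: &[Vec<f64>], b: &[Vec<f64>]) -> f64 {
    a.iter()
        .zip(b)
        .flat_map(|(ra, rb)| ra.iter().zip(rb).map(|(x, y)| (x - y).abs()))
        .fold(0.0, f64::max)
}

fn main() {
    let p = rank1_projection(&[3.0, 1.0, 2.0]);
    let pp = matmul(&p, &p);
    // Idempotence: P² = P up to floating-point error.
    assert!(max_abs_diff(&p, &pp) < 1e-12);
}
```

The algebraic reason: (v·vᵀ/⟨v,v⟩)² = v(vᵀv)vᵀ/⟨v,v⟩² = v·vᵀ/⟨v,v⟩.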

Module map

| Module | Responsibility |
| --- | --- |
| projection | Rank-1 projection matrices, from_seed() |
| session | N projections composed; detect() → Detection, iterate() → Trajectory |
| eigenvalue | Geometric eigenvalue extraction, agreement check |
| agreement | Eigenvalue → HKDF-SHA256 → 32-byte symmetric key |
| coincidence | Core coincidence gate (AND over projections) |
| detection | Structured measurement newtypes (Eigenvalue, Focus, Magnitude, Detection, Measurement, Outcome, Strength) |
| seal | ChaCha20-Poly1305 symmetric encryption (seal/open) |
| dense | Dense matrix operations |
| encoding | Content → state vector encoding |
| state | State vector type |
| hash | Hash utilities |
| hash_cache | Cached hash computations |
| session_hash | Session-level hashing |
| spectral | Graph Laplacian of content tree — D in (A, H, D). Laplacian, Spectrum, SpectralDistance. Jacobi eigenvalues. |
| commutator | [D, P] bridge — EncodingMap (n×d), Commutator (n×n). Lifts P into vertex space via E·P·Eᵀ. Frobenius norm measures projection variation across tree geometry. edge_decomposition() via incidence matrix: per-edge commutator norms. |
| trajectory | Detection sequence during iteration — Trajectory, converged(), drift(), stable_from(), eigenvalue(), final_strength() |
| distance | Connes distance — ConnesDistance, connes_distance(), connes_distance_bytes(). Observer-relative geometric distance between states via commutator norm constraint. |
| crystallize | Spectral fingerprint — Crystal::from_tree(), vector(), distance(), spectral_distance(), isospectral(). Eigenvalues (observer-independent) + commutator norms (observer-dependent) as a fixed-length vector. |
| incidence | Oriented incidence matrix — IncidenceMatrix, from_adjacency(), from_laplacian(), laplacian_product() (verifies L = BBᵀ), cycle_space() (ker B), conservation_laws() (ker Bᵀ), cycle_rank(). Bridge between edge dynamics and vertex geometry. |
| fragment_projection | Projection over fragment trees |
| ffi | Foreign function interface |

Detection Vocabulary (landed)

Session::detect() returns Detection, not Option<String>. Structured measurement records. Every value carries its meaning in the type.

| Type | Wraps | Meaning |
| --- | --- | --- |
| Eigenvalue(Vec&lt;u8&gt;) | raw bytes | Shared invariant. Display → hex. HKDF takes bytes directly. |
| Focus(Vec&lt;f64&gt;) | coefficients | Projected result — what was observed |
| Magnitude(f64) | L2 norm ‖focus‖ | How strongly the projection responds |
| Detection | composite | Event record — eigenvalue + measurements + outcome |
| Measurement | composite | Per-projection observation — focus + magnitude |
| Outcome | enum | Agree / Fragile(Magnitude) / Disagree / Zero |

The eigenvalue is bytes, not a hex string — Display is for humans, HKDF is for bytes. No hex round-trip in the key derivation path. Outcome::Zero distinguishes "zero state" from "non-zero but disagreeing."
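The table's shapes can be sketched as plain Rust newtypes. Illustrative only: derives, visibility, and helper methods in the actual detection module may differ.

```rust
#![allow(dead_code)]

// Illustrative newtype shapes mirroring the vocabulary table,
// not the crate's actual definitions.

#[derive(Debug, Clone, PartialEq)]
struct Eigenvalue(Vec<u8>); // raw bytes; hex only at Display time

#[derive(Debug, Clone)]
struct Focus(Vec<f64>); // projected coefficients

#[derive(Debug, Clone, Copy, PartialEq)]
struct Magnitude(f64); // L2 norm of the focus

#[derive(Debug, Clone, PartialEq)]
enum Outcome {
    Agree,
    Fragile(Magnitude),
    Disagree,
    Zero, // zero state, distinct from non-zero disagreement
}

impl Focus {
    fn magnitude(&self) -> Magnitude {
        Magnitude(self.0.iter().map(|x| x * x).sum::<f64>().sqrt())
    }
}

fn main() {
    let f = Focus(vec![3.0, 4.0]);
    assert_eq!(f.magnitude(), Magnitude(5.0)); // ‖(3, 4)‖ = 5
}
```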

Why not Trace?

conversation's Trace<T, E> is transformation lineage — result + oid + parent. Different concept. Traces track where a value came from. Detections track whether N projections agree. stage_play's Outcome is pipeline execution semantics — full/partial/failure. Different concept. Each name says what it is in its domain.


Quantum Encryption Pipeline

| Step | What | Status |
| --- | --- | --- |
| 1 | Key → Projection (Projection::from_seed()) | Done |
| 2 | Session from Keys (Session::from_fingerprints()) | Done |
| 3 | Shared Eigenvalue (geometric, no SHA-256 in measurement path) | Done |
| 4 | Key Agreement (Agreement — eigenvalue → HKDF → 32 bytes) | Done |
| 5 | Detection vocabulary (newtypes, Session::detect() → Detection) | Done |
| 6 | Encrypt/Decrypt (Agreement → ChaCha20-Poly1305) | Done |
| 7 | End-to-end pipeline test (fingerprints → detection → key → encrypt → decrypt) | Done |

Sequencing

Near-term — completed

  1. Detection vocabulary. Eigenvalue, Focus, Magnitude, Detection, Measurement, Outcome newtypes. Session::detect() returns Detection. Agreement takes &Detection. 15 new tests + ~55 migrated.

  2. Encrypt/Decrypt. seal()/open() with ChaCha20-Poly1305 AEAD. Wire format: [12-byte nonce | ciphertext + 16-byte tag]. 6 tests including tampered-ciphertext rejection.

  3. End-to-end pipeline test. tests/pipeline.rs — fingerprints → session → detection → agreement → seal → open → plaintext recovered. Cross-participant rejection verified.
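The seal wire format in item 2 is simple to frame and unframe. A sketch of the framing only, with hypothetical helper names (a real open() would hand both parts to the ChaCha20-Poly1305 AEAD):

```rust
// Wire format from the roadmap: [12-byte nonce | ciphertext + 16-byte tag].
// split_sealed is a hypothetical helper, not the crate's API.

const NONCE_LEN: usize = 12;
const TAG_LEN: usize = 16;

fn split_sealed(wire: &[u8]) -> Option<(&[u8], &[u8])> {
    if wire.len() < NONCE_LEN + TAG_LEN {
        return None; // too short to contain a nonce and a tag
    }
    // First 12 bytes are the nonce; the rest is ciphertext plus tag.
    Some(wire.split_at(NONCE_LEN))
}

fn main() {
    let wire = vec![0u8; 12 + 5 + 16]; // nonce + 5 ciphertext bytes + tag
    let (nonce, ct_and_tag) = split_sealed(&wire).unwrap();
    assert_eq!(nonce.len(), 12);
    assert_eq!(ct_and_tag.len(), 21);
    assert!(split_sealed(&[0u8; 10]).is_none()); // rejects truncated input
}
```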

Near-term — spectral (completed)

  1. Dirac operator (graph Laplacian). Laplacian::from_tree() builds L = D - A from any Fragmentable tree. DFS walk, parent-child containment edges. Lens targets excluded (external references). Pure Rust Jacobi eigenvalue algorithm. Spectrum (sorted eigenvalues), SpectralDistance (L1 on zero-padded spectra), components() (connected components), fiedler_value() (algebraic connectivity). 11 tests including known spectra: Star K_{1,3} → [0, 1, 1, 4], Path P₃ → [0, 1, 3].
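The known star spectrum can be verified without running Jacobi: for Star K_{1,3}, L·v = λ·v holds for explicit eigenvectors. A self-contained check with the Laplacian written out by hand:

```rust
// Star K_{1,3}: vertex 0 is the hub, vertices 1..3 are leaves.
// L = Degree - Adjacency; the roadmap's known spectrum is [0, 1, 1, 4].

fn laplacian_star4() -> [[f64; 4]; 4] {
    [
        [3.0, -1.0, -1.0, -1.0],
        [-1.0, 1.0, 0.0, 0.0],
        [-1.0, 0.0, 1.0, 0.0],
        [-1.0, 0.0, 0.0, 1.0],
    ]
}

fn apply(m: &[[f64; 4]; 4], v: &[f64; 4]) -> [f64; 4] {
    let mut out = [0.0; 4];
    for i in 0..4 {
        for j in 0..4 {
            out[i] += m[i][j] * v[j];
        }
    }
    out
}

fn is_eigenpair(m: &[[f64; 4]; 4], v: &[f64; 4], lambda: f64) -> bool {
    apply(m, v).iter().zip(v).all(|(lv, vi)| (lv - lambda * vi).abs() < 1e-12)
}

fn main() {
    let l = laplacian_star4();
    assert!(is_eigenpair(&l, &[1.0, 1.0, 1.0, 1.0], 0.0)); // constant vector
    assert!(is_eigenpair(&l, &[0.0, 1.0, -1.0, 0.0], 1.0)); // leaf difference
    assert!(is_eigenpair(&l, &[3.0, -1.0, -1.0, -1.0], 4.0)); // hub vs leaves
}
```

Eigenvalue 1 has multiplicity 2 (any difference of two leaves), completing the spectrum [0, 1, 1, 4].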

Near-term — commutator (completed)

  1. Commutator [D, P]. EncodingMap::from_tree() walks DFS, encodes each node's content into the projection basis via encode_into_basis, normalizes rows to unit length. EncodingMap::lift() computes E·P·Eᵀ (n×n). Commutator::compute() builds [D, P_lifted] = D·E·P·Eᵀ - E·P·Eᵀ·D. Frobenius norm measures projection variation across tree geometry. Skew-symmetric (both D and P_lifted are symmetric). 10 tests including zero commutator for uniform content, observer dependence, topology dependence, and skew-symmetry.
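The skew-symmetry claim follows from [D, M]ᵀ = -[D, M] whenever D and M are both symmetric. A minimal numeric check with hand-picked 2×2 matrices, not the crate's EncodingMap lift:

```rust
// [D, M] = D·M - M·D. For symmetric D and M:
// (D·M - M·D)ᵀ = M·D - D·M = -(D·M - M·D), so the commutator is
// skew-symmetric and its diagonal is zero.

type Mat = Vec<Vec<f64>>;

fn matmul(a: &Mat, b: &Mat) -> Mat {
    let n = a.len();
    (0..n)
        .map(|i| (0..n).map(|j| (0..n).map(|k| a[i][k] * b[k][j]).sum()).collect())
        .collect()
}

fn commutator(d: &Mat, m: &Mat) -> Mat {
    let (dm, md) = (matmul(d, m), matmul(m, d));
    dm.iter()
        .zip(&md)
        .map(|(r1, r2)| r1.iter().zip(r2).map(|(x, y)| x - y).collect())
        .collect()
}

fn frobenius(c: &Mat) -> f64 {
    c.iter().flatten().map(|x| x * x).sum::<f64>().sqrt()
}

fn main() {
    let d = vec![vec![2.0, -1.0], vec![-1.0, 1.0]]; // symmetric stand-in for the Dirac operator
    let p = vec![vec![1.0, 0.0], vec![0.0, 0.0]];   // symmetric projection
    let c = commutator(&d, &p);
    // Skew-symmetry: c[i][j] = -c[j][i], zero diagonal.
    assert!((c[0][1] + c[1][0]).abs() < 1e-12);
    assert!(c[0][0].abs() < 1e-12 && c[1][1].abs() < 1e-12);
    // Non-zero norm: this projection varies across this geometry.
    assert!(frobenius(&c) > 0.0);
}
```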

Near-term — evolved math (completed)

  1. Detection strength. Strength newtype in [0,1]. compute_strength() returns the minimum magnitude ratio across all projection pairs — the weakest link in the observation chain. Single projection: 1.0 if it fires, 0.0 if not. N projections: min(mag_i / max_mag). Disagree/Zero always 0.0. 10 tests.

  2. Iterative detection. Session::iterate() — the power method applied to the coincidence gate. The eigenvalue at time t becomes the anchor at time t+1. Convergence tracks measurement magnitudes (geometry), not eigenvalue bytes (which include the anchor and always change). With static projections, convergence is immediate (magnitudes are projection-determined). 14 tests.

  3. Trajectory. Trajectory type wrapping Vec&lt;Detection&gt; with convergence metadata. converged(), eigenvalue(), drift() (max magnitude change), stable_from() (index where magnitudes stabilize), final_strength(). Two constructors: new() for eigenvalue-equality convergence (backward compat), with_convergence() for measurement-level convergence (iterate). 14 tests.

  4. Connes distance. ConnesDistance newtype. connes_distance() computes d(psi, phi) = sup |&lt;psi|P|psi&gt; - &lt;phi|P|phi&gt;| / ||[D,P]|| over all projections P. The commutator norm ||[D,P]|| constrains the algebra — projections that vary more across the geometry contribute less per unit norm. Properties: identity (d(psi, psi) = 0), symmetry, positivity for distinct states, monotonicity in projection count. connes_distance_bytes() convenience wrapper. 8 tests.
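Item 2's power method is the standard one from linear algebra. A plain sketch, without anchor feedback, showing convergence to the dominant eigenvalue via the Rayleigh quotient:

```rust
// Power method: repeatedly apply the operator and renormalize; the iterate
// converges to the dominant eigenvector. Session::iterate() applies the
// same idea to the coincidence gate; this is the bare linear-algebra version.

fn apply(m: &[[f64; 2]; 2], v: [f64; 2]) -> [f64; 2] {
    [m[0][0] * v[0] + m[0][1] * v[1], m[1][0] * v[0] + m[1][1] * v[1]]
}

fn normalize(v: [f64; 2]) -> [f64; 2] {
    let n = (v[0] * v[0] + v[1] * v[1]).sqrt();
    [v[0] / n, v[1] / n]
}

fn power_method(m: &[[f64; 2]; 2], mut v: [f64; 2], iters: usize) -> ([f64; 2], f64) {
    for _ in 0..iters {
        v = normalize(apply(m, v));
    }
    // Rayleigh quotient v·(M·v) estimates the dominant eigenvalue.
    let mv = apply(m, v);
    (v, v[0] * mv[0] + v[1] * mv[1])
}

fn main() {
    // Symmetric matrix with eigenvalues 3 and 1.
    let m = [[2.0, 1.0], [1.0, 2.0]];
    let (_, lambda) = power_method(&m, [1.0, 0.0], 50);
    assert!((lambda - 3.0).abs() < 1e-9);
}
```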

Near-term — crystallize (completed)

  1. Crystallize. Crystal::from_tree() composes Laplacian eigenvalues (observer-independent, sorted ascending) with commutator Frobenius norms ||[D, P_i]|| (observer-dependent, one per projection) into a fixed-length spectral fingerprint. vector() returns the concatenated [spectrum ++ norms]. distance() (L2 on the full vector), spectral_distance() (L2 on the spectrum only), isospectral() (spectrum equality within tolerance). Content-addressed: same tree + same projections = same crystal. Known spectra verified: Star K_{1,3} → [0, 1, 1, 4], Path P₃ → [0, 1, 3]. 14 tests.
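The crystal layout, spectrum concatenated with norms, and its L2 distance can be sketched directly. The norm values below are made up for illustration:

```rust
// A Crystal as described in the roadmap: [spectrum ++ commutator norms],
// one flat Vec<f64>. Distance is plain L2 on the concatenated vector.
// Hypothetical free functions, not the crate's Crystal methods.

fn crystal(spectrum: &[f64], norms: &[f64]) -> Vec<f64> {
    spectrum.iter().chain(norms).copied().collect()
}

fn l2_distance(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| (x - y).powi(2)).sum::<f64>().sqrt()
}

fn main() {
    // Star K_{1,3} spectrum from the roadmap, plus two made-up norms.
    let a = crystal(&[0.0, 1.0, 1.0, 4.0], &[0.5, 0.25]);
    let b = crystal(&[0.0, 1.0, 1.0, 4.0], &[0.5, 0.25]);
    assert_eq!(l2_distance(&a, &b), 0.0); // same tree + same projections
    assert_eq!(a.len(), 4 + 2);           // n eigenvalues + N norms
}
```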

Medium-term (projection testing)

  1. Tests as projections. Each test is a projection in the session. The commit hash requires all projections to agree. Enforcement from inside, not outside. See: insights/fragmentation/projection-testing.md.

  2. Coverage as geometry. The projection matrix IS the coverage map. Grammar-based, not line-based. Uncovered subspace = dimensions no test observes. Redundant coverage = increased collision resistance.

Longer-term (coincidence hash)

  1. Coincidence hash as HashAlg implementation. hash(data) encodes to a state vector, applies N projection matrices, extracts the shared eigenvalue. The hash function is constituted by the participants. See: ../fragmentation/ROADMAP.md — medium-term item 13.

Crystallized Vectors — The Fortran Path

The spectral triple (A, H, D) over content-addressed trees produces a Crystal: a flat Vec<f64> directly passable to BLAS/LAPACK/Fortran via FFI. No conversion. No serialization. Just a pointer and a length.

The Spectral Triple

| Component | Mathematical object | Implementation |
| --- | --- | --- |
| A (algebra) | Rank-1 projection matrices, P² = P | Projection::from_seed() |
| H (Hilbert space) | R^d, state vectors | StateVector |
| D (Dirac operator) | Graph Laplacian, L = Degree - Adjacency | Laplacian::from_tree() |

The Connes distance formula holds exactly:

d(ψ, φ) = sup { |⟨ψ|P|ψ⟩ - ⟨φ|P|φ⟩| / ‖[D, P]‖ }

over all projections P. The numerator: how differently two states respond to the same observation. The denominator: how much that observation varies across the geometry. [D, P] = 0 means the projection sees the same thing everywhere — it carries no geometric information.
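A worked numeric instance of the formula, assuming a hand-picked symmetric D and two rank-1 projections chosen so that neither commutator vanishes. This is the arithmetic only, not the crate's distance module:

```rust
// d(ψ, φ) = max over P of |⟨ψ|P|ψ⟩ - ⟨φ|P|φ⟩| / ‖[D, P]‖
// over a finite, hand-picked projection set.

type M2 = [[f64; 2]; 2];

fn matmul(a: &M2, b: &M2) -> M2 {
    let mut c = [[0.0; 2]; 2];
    for i in 0..2 {
        for j in 0..2 {
            for k in 0..2 {
                c[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    c
}

fn frobenius(m: &M2) -> f64 {
    m.iter().flatten().map(|x| x * x).sum::<f64>().sqrt()
}

// ⟨ψ|P|ψ⟩ for real state vectors.
fn expectation(p: &M2, psi: &[f64; 2]) -> f64 {
    let pv = [p[0][0] * psi[0] + p[0][1] * psi[1], p[1][0] * psi[0] + p[1][1] * psi[1]];
    psi[0] * pv[0] + psi[1] * pv[1]
}

fn connes_distance(d: &M2, projections: &[M2], psi: &[f64; 2], phi: &[f64; 2]) -> f64 {
    projections
        .iter()
        .map(|p| {
            let (dp, pd) = (matmul(d, p), matmul(p, d));
            let comm = [[dp[0][0] - pd[0][0], dp[0][1] - pd[0][1]],
                        [dp[1][0] - pd[1][0], dp[1][1] - pd[1][1]]];
            (expectation(p, psi) - expectation(p, phi)).abs() / frobenius(&comm)
        })
        .fold(0.0, f64::max)
}

fn main() {
    let d: M2 = [[2.0, 1.0], [1.0, 2.0]]; // symmetric stand-in for the Laplacian
    // Rank-1 projections onto (1,0) and (2,1)/√5; neither is an
    // eigenvector of D, so both commutators are non-zero.
    let ps = [[[1.0, 0.0], [0.0, 0.0]], [[0.8, 0.4], [0.4, 0.2]]];
    let (psi, phi) = ([1.0, 0.0], [0.0, 1.0]);
    assert!(connes_distance(&d, &ps, &psi, &psi) == 0.0); // identity
    let fwd = connes_distance(&d, &ps, &psi, &phi);
    let back = connes_distance(&d, &ps, &phi, &psi);
    assert!((fwd - back).abs() < 1e-12); // symmetry
    assert!(fwd > 0.0); // distinct states separate
}
```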

Five Layers

  1. Spectral triple (A, H, D) — the geometry of a content tree
  2. Coincidence gate — N independent projections → shared eigenvalue
  3. Detection strength — continuous confidence in [0,1], not binary agree/disagree
  4. Iterative detection — power method with anchor feedback, convergent self-measurement
  5. Crystal — spectral fingerprint: [λ₁...λₙ, ‖[D,P₁]‖...‖[D,Pₙ]‖]

Crystal = Fortran Vector

The Crystal vector is:

  • Flat — contiguous f64 array. No indirection. No pointers.
  • Dense — IEEE 754 64-bit doubles. Same as Fortran DOUBLE PRECISION.
  • FFI-ready — crystal.vector().as_ptr() yields a *const f64 → passable to DGEMM, DSYEV, DNRM2, any BLAS/LAPACK routine.
  • Fixed-length — for a given tree size (n nodes) and projection count (N), the vector is always n + N elements.

Distance between two grammar states: ‖crystal_a - crystal_b‖. One vector subtraction and a norm. That's BLAS. That's what Fortran was built for.
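That claim can be made concrete: the fingerprint really is a pointer and a length, and the distance is one subtraction plus a DNRM2-style norm. A pure-Rust stand-in for the BLAS call, with made-up crystal values:

```rust
// nrm2 is a pure-Rust equivalent of BLAS DNRM2; a real binding would
// hand the same pointer and length to LAPACK/BLAS directly.

fn nrm2(x: &[f64]) -> f64 {
    x.iter().map(|v| v * v).sum::<f64>().sqrt()
}

fn distance(a: &[f64], b: &[f64]) -> f64 {
    let diff: Vec<f64> = a.iter().zip(b).map(|(x, y)| x - y).collect();
    nrm2(&diff)
}

fn main() {
    // Hypothetical crystals: star spectrum + one norm vs path spectrum + one norm.
    let crystal_a: Vec<f64> = vec![0.0, 1.0, 1.0, 4.0, 0.5];
    let crystal_b: Vec<f64> = vec![0.0, 1.0, 3.0, 0.0, 0.5];
    // The entire FFI surface: a pointer and a length, no serialization.
    let (_ptr, len): (*const f64, usize) = (crystal_a.as_ptr(), crystal_a.len());
    assert_eq!(len, 5);
    // diffs are (0, 0, -2, 4, 0), so the distance is √(4 + 16) = √20.
    assert!((distance(&crystal_a, &crystal_b) - 20.0f64.sqrt()).abs() < 1e-12);
}
```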

Preemptive Crystallization

If the grammar is finite — finite types, finite variants, finite compositions, not Turing-complete — the compiler can enumerate every reachable state. Every state has a tree. Every tree has a Laplacian. Every Laplacian has a spectrum.

The compiler runs Jacobi once per state. At compile time. Emits Crystals as constants. The runtime does lookup, not computation. The binary ships with a flat array of pre-computed vectors and a distance matrix between all reachable states.

Observer Split

The spectrum (Laplacian eigenvalues) is observer-independent. Same tree = same spectrum, regardless of who measures. Pre-computable at compile time.

The commutator norms (‖[D, Pᵢ]‖) are observer-dependent. Different projections = different norms. Requires the observer's keys/fingerprints.

When a new observer arrives: take the pre-computed Laplacian, compute their commutator norms (requires their projections), concatenate with the existing spectrum. Half the work is already done.

Where the Math Breaks

Four breaks, each pointing to existing mathematical frameworks:

| Break | What it says | What it needs |
| --- | --- | --- |
| Linearity | Biology does nonlinear coarse-graining | Kernel methods / RKHS |
| Independence | The observer IS part of the system | Self-referential fixed points |
| Discreteness | Measurement is graded, not sharp | POVMs / probability distributions |
| Stasis | The eigenvalue is a trajectory, not a point | Grassmannian dynamics / modular flow |

The breaks are the roadmap for where the math evolves. Each framework exists. Each has convergence theorems. Each is implementable.

Connection to the Compiler

The conversation compiler's grammar is not Turing-complete — finite types, finite variants, finite compositions. Rice's theorem doesn't apply. Verification is decidable. resolve.rs walks the type surface exhaustively — satisfiability checking. Content addresses (OIDs) are proof certificates.

The integration surface: where grammar trees meet Laplacian::from_tree(). The compiler already produces content-addressed trees. The spectral triple already consumes them. Crystallization is the compiler emitting not just a BEAM module but a complete geometric fingerprint of every reachable state.

See: ../conversation/ROADMAP.md — Crystallized Vectors section for the compiler integration surface.


Session 2026-03-24. Alex + Mara.