
Spectral Memory and Graph Routing for Language Model Agents

Tamás Nagy, Ph.D. · Updated 2026-03-16 · Short Draft · Machine Learning
Unreviewed draft. This paper has not been human-reviewed. Mathematical claims may be unverified. Use with appropriate caution.

Abstract

LLM agents retrieve context via flat embedding similarity: the query is embedded, the \(k\) nearest neighbors are returned, and relevance decays with cosine distance. This approach ignores structure: which claims support which, which topics form coherent clusters, where the gaps are, and what timescale each memory operates on. We introduce spectral memory routing: episodic memory is decomposed into spectral modes via eigenanalysis of the memory similarity kernel, and a claim/task graph is analyzed via its Laplacian spectrum. The result is a retrieval system that (1) compresses \(N\) memory items into \(K \ll N\) mode centroids with information loss certified to stay below a fixed threshold, (2) identifies structural bridges and frontiers in the knowledge graph that flat similarity cannot detect, (3) provides built-in regime-switch detection when the dominant memory mode changes, and (4) reduces context window usage by \(3\)-\(5\times\) at equal retrieval precision. We formalize the method, prove compression optimality guarantees, and describe experiments on the Nous session store and Lean proof dependency graph.
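The mode decomposition the abstract describes can be sketched as follows. This is an illustrative assumption, not the paper's implementation: `spectral_memory_modes`, the loading-weighted centroid construction, and the retained-energy "certificate" are stand-ins for the formal method.

```python
import numpy as np

def spectral_memory_modes(embeddings, k):
    """Compress N memory embeddings into k spectral-mode centroids
    via eigenanalysis of the cosine-similarity kernel (sketch)."""
    # Normalize rows so the Gram matrix is the cosine-similarity kernel.
    X = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    K_mat = X @ X.T                           # N x N similarity kernel
    evals, evecs = np.linalg.eigh(K_mat)      # symmetric PSD, ascending order
    top = np.argsort(evals)[::-1][:k]         # indices of k dominant modes
    modes = evecs[:, top]                     # N x k mode loadings
    # Mode centroids: loading-weighted combinations of memory items.
    weights = np.abs(modes) / np.abs(modes).sum(axis=0, keepdims=True)
    centroids = weights.T @ X                 # k x d compressed memory
    # Retained spectral energy plays the role of the compression certificate.
    energy = evals[top].sum() / evals.clip(min=0).sum()
    return centroids, energy
```

At query time, the query embedding is compared against the \(K\) centroids instead of all \(N\) items, which is where the claimed context-window savings come from; `energy` close to 1 indicates little information is lost by routing through modes rather than raw items.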

Length: 2,068 words
Claims: 2 theorems
Status: Draft
Target: ICML / NeurIPS

Connects To

Spectral Regime Detection: Change-Point Identification via E...
Spectral of Spectrals: Second-Order Mode Decomposition for C...
The Spectral Cognitive Resonator: A Dynamic Architecture for...
Spectral Knowledge Distillation: From Black Box to Certified...
