Fin Progressive Spectral
Abstract
We formalize the Progressive Spectral Decomposition (PSD): computing eigenvalues of a symmetric matrix one at a time via deflation, with a combination of properties no existing method provides simultaneously. PSD is (1) streaming: every intermediate rank-\(K\) result is immediately usable; (2) exact: the rank-\(K\) progressive result is bit-for-bit identical to the rank-\(K\) truncation of the full eigendecomposition; and (3) provably optimal: each intermediate result minimizes \(\|A - B\|_F\) over all matrices \(B\) of rank \(\leq K\) (Eckart-Young). We prove these properties, along with monotone error decrease, gap-dependent convergence, warm-start acceleration, computable stopping criteria, streaming rank-1 updates, and portfolio risk preservation, in Lean 4 (12 files, 0 sorry). Benchmarks demonstrate 2--6\(\times\) warm-start acceleration on financial correlation matrices (\(n \leq 1{,}000\)) and scalability to \(n = 500{,}000\) via implicit factor-model representations. A Lanczos-based implementation (v2) further achieves a 2.5\(\times\) speedup over power iteration at \(n = 500{,}000\), delivering the first actionable eigenvalue in 1.5 seconds. The key insight is not algorithmic novelty (deflation is classical) but a new computing paradigm: like approaching a landscape from a distance, every intermediate result reveals the truth at that resolution, and walking closer never revises what came before.
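The one-eigenvalue-at-a-time deflation paradigm can be sketched in a few lines. The snippet below is a minimal illustration using classical power iteration plus rank-1 deflation, not the paper's verified implementation; the function name `psd_eigs` and all parameters are hypothetical, and it assumes the top-\(k\) eigenvalues are positive and well separated.

```python
# Minimal sketch of deflation-based progressive eigendecomposition.
# `psd_eigs` is an illustrative name, not the paper's API. Assumes the
# leading eigenvalues are positive and distinct so power iteration converges.

def psd_eigs(A, k, iters=1000):
    """Return the top-k eigenpairs of a symmetric matrix A, one at a time."""
    n = len(A)
    A = [row[:] for row in A]  # work on a copy; deflation mutates A
    pairs = []
    for _ in range(k):
        v = [1.0] * n  # deterministic start vector (must overlap eigenvector)
        for _ in range(iters):  # power iteration: v <- A v / ||A v||
            w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
            norm = sum(x * x for x in w) ** 0.5
            v = [x / norm for x in w]
        # Rayleigh quotient gives the eigenvalue estimate
        lam = sum(v[i] * A[i][j] * v[j] for i in range(n) for j in range(n))
        pairs.append((lam, v))  # streaming: this rank-1 term is usable now
        for i in range(n):      # deflate: A <- A - lam * v v^T
            for j in range(n):
                A[i][j] -= lam * v[i] * v[j]
    return pairs
```

Each appended pair extends the running rank-\(K\) approximation \(\sum_{k \leq K} \lambda_k v_k v_k^T\) without revising earlier terms, which is the streaming property the abstract describes.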
Keywords: eigenvalue decomposition, deflation, Eckart-Young, progressive computation, real-time risk, streaming algorithms, formal verification, Lean 4