A Fistful of Errors: A Discrete Formulation of Neural Network Behavior in Floating Point Representation
This bulletin presents the Lattice Hypothesis, proposing that neural networks fundamentally operate on discrete floating-point lattices rather than as continuous functions on ℝⁿ. Traditional continuous analysis is reconceptualized as the approximation; the discrete lattice structure is the computational reality.
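As a rough illustration of the lattice view (a sketch of mine, not drawn from the bulletin itself), the snippet below uses NumPy to show that float32 values sit on a discrete grid whose spacing grows with magnitude, so real-valued perturbations smaller than the local spacing are invisible to a float32 layer:

```python
# A minimal sketch (illustrative, not from the bulletin): float32 values form a
# discrete lattice whose spacing (one ulp) depends on magnitude, so perturbations
# below the local spacing collapse onto the same lattice point.
import numpy as np

x = np.float32(1.0)
ulp = np.spacing(x)                                   # ~1.19e-07: gap to the next float32
print("adjacent lattice points:",
      np.nextafter(x, np.float32(-np.inf)), x, np.nextafter(x, np.float32(np.inf)))

# A real-valued perturbation below half the lattice spacing rounds away entirely,
# so two mathematically distinct inputs are the same point on the lattice.
x_perturbed = np.float32(x + ulp / 4)
print("collapses to the same lattice point:", x_perturbed == x)   # True

# Consequently a float32 "layer" maps lattice points to lattice points.
w, b = np.float32(0.5), np.float32(0.25)
print("identical outputs:", np.float32(x * w + b) == np.float32(x_perturbed * w + b))
```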
Curiouser and Curiouser: Apparent Contradictions in Neural Network Observables
This bulletin explores the Hallway Hypothesis, arguing that constraints which reduce wrong moves matter more than constraints which merely reduce total moves. Analysis of phenomena such as LoRA effectiveness and quantization failure modes reveals the geometry of transformation corridors in neural network pipelines.
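One hedged reading of the corridor framing, sketched below with illustrative NumPy code rather than anything taken from the bulletin: a LoRA update ΔW = BA is confined to a rank-r subspace of the full weight space, so it restricts which directions the weights can move in, not merely how many parameters move:

```python
# A minimal sketch (assumed notation, not the bulletin's) of LoRA as a corridor:
# the reachable updates dW = B @ A span at most r directions of the d_out x d_in
# parameter space, regardless of how long training runs.
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 128, 4

W0 = rng.standard_normal((d_out, d_in))       # frozen pretrained weights
B = rng.standard_normal((d_out, r)) * 0.01    # stand-ins for trained LoRA factors
A = rng.standard_normal((r, d_in)) * 0.01     # (in actual LoRA, B starts at zero)

dW = B @ A                                    # every reachable update has this form
print("trainable params:", r * (d_in + d_out), "of", d_out * d_in)   # 768 of 8192
print("rank(dW) =", np.linalg.matrix_rank(dW), "<= r =", r)          # 4 <= 4
W = W0 + dW                                   # adapted weights stay inside W0's corridor
```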
Not Even Long: An Information-Theoretic Perspective on Precision in Neural Networks
This bulletin examines the Landauer Hypothesis, treating precision not as a hyperparameter to optimize but as a physical quantity to measure. It explores how the thermodynamic cost of bit erasure constrains low-precision quantization schemes, with implications for NVFP4 deployment at the edge.
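For scale, a back-of-the-envelope calculation (my own illustrative numbers, not the bulletin's) of the Landauer floor for erasing the bits discarded when quantizing FP32 weights to a 4-bit format such as NVFP4:

```python
# A minimal sketch (illustrative figures, not from the bulletin): Landauer's bound
# gives a minimum energy of k_B * T * ln(2) to erase one bit, so quantizing FP32
# weights to a 4-bit element format discards ~28 bits of state per weight
# (ignoring per-block scale metadata) and carries a corresponding thermodynamic floor.
import math

k_B = 1.380649e-23                        # Boltzmann constant, J/K
T = 300.0                                 # room temperature, K
e_bit = k_B * T * math.log(2)             # ~2.87e-21 J per erased bit

n_params = 7e9                            # e.g. a 7B-parameter model (assumed size)
bits_erased_per_weight = 32 - 4           # FP32 -> 4-bit elements
e_total = e_bit * bits_erased_per_weight * n_params

print(f"Landauer floor per bit:  {e_bit:.3e} J")
print(f"Floor for whole model:   {e_total:.3e} J")   # ~5.6e-10 J with these numbers
```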