Compressed External Latent State (CELS)

tl;dr: We find that by emulating a language model's internal latent space, represented as a graph, we can reduce token usage by 99% while retaining semantic meaning.

What Latent Space Actually Is

Latent space is the hidden geometry where modern language models actually think. Inside every transformer layer, the model continuously compresses meaning into […]