EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs (github.com)
34 points by jbotz 2 days ago
mountainriver a day ago
TTT, Canon layers, and Titans seem like a stronger approach IMO. Information needs to be compressed into latent space or it becomes computationally intractable.
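[A rough way to see the intractability point raised above: full self-attention over n tokens costs on the order of n² · d FLOPs, while a fixed-size latent state updated per token (the TTT/Titans-style recurrent view) costs on the order of n · d². A minimal back-of-envelope sketch in Python; the numbers are illustrative assumptions, not figures from the linked paper:]

    def attention_flops(seq_len: int, d_model: int) -> int:
        # Q·K^T scores plus the weighted sum over values: roughly 2 * n^2 * d.
        return 2 * seq_len * seq_len * d_model

    def latent_state_flops(seq_len: int, d_state: int) -> int:
        # A fixed-size recurrent/latent update costs roughly d^2 per token: n * d^2.
        return seq_len * d_state * d_state

    d = 4096
    for n in (8_000, 128_000, 1_000_000):
        print(f"n={n:>9,}  attention={attention_flops(n, d):.3e}  "
              f"latent={latent_state_flops(n, d):.3e}")

[At n = 1M tokens the quadratic term dominates by orders of magnitude, which is the comment's point: without compressing history into a bounded latent state, compute grows quadratically with context.]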
MacsHeadroom a day ago
So, infinite context length by making it compute-bound instead of memory-bound. Curious how much longer this takes to run and when it makes sense to use vs. RAG.
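[For context on the compute-vs-memory trade-off and the RAG comparison being asked about: an EM-LLM-style approach retrieves relevant stored "episodic" segments at inference time rather than attending over everything, which is structurally similar to RAG retrieval but operates over the model's own segment representations. A minimal sketch of top-k retrieval by similarity; the names, shapes, and scoring are illustrative assumptions, not the paper's actual API:]

    import numpy as np

    def retrieve_segments(query: np.ndarray,
                          segment_reps: np.ndarray,
                          k: int = 4) -> np.ndarray:
        # Rank stored segments by cosine similarity to the current query
        # and return the indices of the top-k matches.
        q = query / np.linalg.norm(query)
        reps = segment_reps / np.linalg.norm(segment_reps, axis=1, keepdims=True)
        scores = reps @ q
        return np.argsort(scores)[::-1][:k]

    rng = np.random.default_rng(0)
    segment_reps = rng.normal(size=(1000, 64))   # 1,000 stored episodic segments
    query = rng.normal(size=64)                  # current query representation
    print(retrieve_segments(query, segment_reps, k=4))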