In a recent episode of the "Eddy Says Hi" podcast, the host explored how AI agents can overcome their notorious "goldfish memory" using LLM Wiki v2 and AgentMemory. LLM Wiki v2 is a major upgrade to Andrej Karpathy's original concept, designed for production-scale AI memory.

The discussion covers the memory lifecycle, emphasizing that treating all data equally leads to digital rot: confidence scoring, supersession, and Ebbinghaus-inspired forgetting curves keep the AI's knowledge base sharp and relevant. The episode also highlights the shift from flat markdown pages to typed knowledge graphs, where relationships like "depends on" or "contradicts" enable precise navigation.

For scaling to thousands of pages, hybrid search combines BM25 keyword matching, vector embeddings, and graph traversal into a context-aware search engine. Finally, "crystallization" automatically distills raw sessions into high-value structured facts.
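The Ebbinghaus-inspired forgetting mentioned above can be sketched with the classic retention curve R = e^(-t/S). This is a hypothetical illustration, not LLM Wiki v2's actual implementation; the `stability` parameter and the pruning threshold are assumptions for the example.

```python
import math

def retention(elapsed_days: float, stability: float) -> float:
    """Ebbinghaus-style retention: R = e^(-t/S).

    elapsed_days: time since the memory was last reinforced.
    stability: higher values decay more slowly (a system might boost
    this each time a memory is recalled or confirmed).
    """
    return math.exp(-elapsed_days / stability)

def should_forget(elapsed_days: float, stability: float,
                  threshold: float = 0.2) -> bool:
    # Prune (or demote) memories whose predicted retention has
    # dropped below the threshold -- hypothetical policy.
    return retention(elapsed_days, stability) < threshold

# A fresh memory is fully retained; a month-old, rarely used one decays.
print(should_forget(1.0, 5.0))   # False
print(should_forget(30.0, 5.0))  # True
```

In a real system the decision would likely also weigh confidence scores and supersession links, not elapsed time alone.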
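One common way to combine BM25, vector, and graph results like the hybrid search described above is reciprocal rank fusion (RRF). The episode does not specify the fusion method, so this is an assumed technique; the page identifiers are invented for illustration.

```python
from collections import defaultdict

def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse multiple ranked result lists (e.g. BM25, vector, graph).

    Each document scores sum(1 / (k + rank)) over every list that
    retrieved it, so items ranked highly by several retrievers win.
    """
    scores: dict[str, float] = defaultdict(float)
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from three retrievers:
bm25   = ["page/auth", "page/db", "page/cache"]
vector = ["page/db", "page/auth", "page/queue"]
graph  = ["page/db", "page/cache"]
print(reciprocal_rank_fusion([bm25, vector, graph])[0])  # page/db
```

"page/db" wins because all three retrievers agree on it, which is exactly the behavior a hybrid engine wants from fusion.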
How LLM Wiki v2 and AgentMemory Give AI Agents Long-Term Memory
AI
April 29, 2026 · 11:15 AM