📰 Show HN: AI memory with biological decay (52% recall)

Major AI news – item 1

  • Most RAG setups fail because they treat memory like a static filing cabinet. When every transient bug fix or abandoned rule is stored forever, the context window eventually chokes on noise, spiking token costs and degrading the agent’s reasoning.
  • This implementation experiments with a biological approach by using the Ebbinghaus forgetting curve to manage context as a living substrate. Memories are assigned a “strength” score where each recall reinforces the data and flattens its decay curve (spaced repetition), while unused data eventually hits a threshold and is pruned.
  • To solve the “logical neighbor” problem, where semantic search misses relevant but non-similar nodes, a graph layer sits on top of the vector store. Benchmarked against the LoCoMo dataset, this reached 52% Recall@5, nearly double the accuracy of stateless vector stores, while cutting token waste by roughly 84%.
  • The system is built as a local-first MCP server on DuckDB. The hypothesis: for agents handling long-running projects, “what to forget” is just as critical as “what to remember.” The author is interested in hearing whether others are exploring non-linear decay or similar biological constraints for context management.
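The decay-and-reinforcement idea above can be sketched as follows. This is a minimal illustration, not the repo's actual code: the class names, the 1.5× reinforcement factor, the day-scaled decay constant, and the 0.05 pruning threshold are all illustrative assumptions.

```python
import math
import time


class Memory:
    """A memory whose retention decays per the Ebbinghaus curve R = exp(-t/S)."""

    def __init__(self, content, now=None):
        self.content = content
        self.strength = 1.0                      # S: grows with each recall
        self.last_recall = now if now is not None else time.time()

    def retention(self, now=None):
        """Retention in [0, 1]; higher strength => flatter decay curve."""
        now = now if now is not None else time.time()
        elapsed = now - self.last_recall
        # Strength is expressed in days (86400 s) -- an illustrative choice.
        return math.exp(-elapsed / (86400 * self.strength))

    def recall(self, now=None):
        """Each recall reinforces the memory (spaced repetition)."""
        self.last_recall = now if now is not None else time.time()
        self.strength *= 1.5                     # illustrative reinforcement factor
        return self.content


def prune(memories, threshold=0.05, now=None):
    """Drop memories whose retention has decayed below the threshold."""
    return [m for m in memories if m.retention(now) >= threshold]
```

A memory left untouched for ten days decays to near zero and gets pruned, while one recalled even once in that window survives with a flattened curve.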

Original link: https://github.com/sachitrafa/YourMemory

Published: April 27, 2026, 8:00 AM

Source: Hacker News

Note: this article was automatically translated and compiled by AI, for reference only.


