From ab22e1ef600c4e796e14ddc741fdbfb327c02aae Mon Sep 17 00:00:00 2001
From: vintro
Date: Fri, 22 Aug 2025 09:36:16 -0400
Subject: [PATCH] fix: grammar

---
 content/blog/Memory as Reasoning.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/blog/Memory as Reasoning.md b/content/blog/Memory as Reasoning.md
index 53b3225b6..441697c79 100644
--- a/content/blog/Memory as Reasoning.md
+++ b/content/blog/Memory as Reasoning.md
@@ -25,7 +25,7 @@ Common storage solutions include, but are not limited to, the following:
 All are useful tools but they assume you already know what’s worth storing and how to structure it. And the formation step is routinely overlooked--ask the model to extract some facts, embed them, store them, done. But once stored, those artifacts are static. The system's success relies on the search strategy aligning with whatever context was baked in during storage.