From ef083332b7da546c38802fae87b80c7657130b0b Mon Sep 17 00:00:00 2001
From: courtlandleer
Date: Tue, 12 Dec 2023 20:01:51 -0500
Subject: [PATCH] metacog

---
 .../Metacognition in LLMs is inference about inference.md | 3 +++
 1 file changed, 3 insertions(+)
 create mode 100644 content/notes/Metacognition in LLMs is inference about inference.md

diff --git a/content/notes/Metacognition in LLMs is inference about inference.md b/content/notes/Metacognition in LLMs is inference about inference.md
new file mode 100644
index 000000000..a133d1abb
--- /dev/null
+++ b/content/notes/Metacognition in LLMs is inference about inference.md
@@ -0,0 +1,3 @@
+For wetware, metacognition is typically defined as "thinking about thinking," or used more loosely as a catch-all for any "higher-order" cognition. In more specific domains, it's an introspective process: thinking about your own thinking.
+
+In large language models, the synthetic analog of cognition is inference. So we can reasonably define a metacognitive process in an LLM as any process that runs inference on the result of prior inference. That is, inference itself is used as context. It might be instantly funneled into the next prompt, stored for later use, or leveraged by another model. Experiments here will be critical to overcoming the machine learning community's fixation on task completion (see [[The machine learning industry is too focused on general task performance]]).
\ No newline at end of file
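
Below is a minimal Python sketch of the loop the note describes: one inference pass whose output is fed back as context for a second pass. The `generate` function is a hypothetical stand-in for any LLM completion call (API or local model); it is not part of the note or any particular library.

```python
# Minimal sketch of "inference about inference": a second inference call that
# takes the output of a prior inference call as its context.
# NOTE: `generate` is a hypothetical placeholder for an LLM completion call;
# swap in a real model call to make this do actual work.

def generate(prompt: str) -> str:
    """Stand-in for a real LLM completion call."""
    return f"[model output for: {prompt[:48]}...]"


def metacognitive_step(task: str) -> str:
    # First-order inference: the model addresses the task directly.
    first_pass = generate(task)

    # Second-order inference: the prior output itself becomes context.
    reflection_prompt = (
        f"Task: {task}\n"
        f"Previous answer: {first_pass}\n"
        "Evaluate the reasoning above and produce a revised answer."
    )
    return generate(reflection_prompt)


if __name__ == "__main__":
    print(metacognitive_step("Why does inference-as-context matter?"))
```

The same shape covers the note's other two cases: the first pass could instead be persisted and retrieved later, or the second call could go to a different model.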