You could argue that a model updating its parameters in real time is the ideal, but it's unlikely to matter. We could do that today if we wanted to; there's just no real incentive to.
This is part of what I mean by encoding emotional state. You want standard, explicit state in a simple form, not a billion-dimension latent space. The interactions with that state are emergently complex, but you won't be able to stuff it all into a context window for a real AGI agent.
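To make "explicit state in a simple form" concrete, here is a minimal sketch: a handful of named, bounded scalars that an orchestration layer can read and write directly, rather than a billion-dimension latent vector crammed into the prompt. The field names, ranges, and decay rule are illustrative assumptions, not a proposed standard.

```python
from dataclasses import dataclass

@dataclass
class EmotionalState:
    """Small, explicit, inspectable state kept outside the model's context window."""
    valence: float = 0.0   # negative..positive affect, roughly in [-1, 1]
    arousal: float = 0.0   # calm..agitated, roughly in [0, 1]
    fatigue: float = 0.0   # accumulates with work, decays with rest

    def decay(self, rate: float = 0.05) -> None:
        # Drift back toward baseline between interactions, loosely analogous
        # to hormone levels returning to equilibrium.
        self.valence *= (1.0 - rate)
        self.arousal *= (1.0 - rate)
        self.fatigue = max(0.0, self.fatigue - rate)
```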
This orchestration layer is the replacement for LLMs. LLMs do bear a lot of similarities to brains, and a lot of dissimilarities, but people shouldn't fixate on that, because _human minds are not brains_: they are systems of many interconnected parts, hormones included.
It is the system framework that we are most prominently missing, not raw intellectual power.
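A rough sketch of the kind of system framework meant here, continuing the EmotionalState example above: the LLM is one component in a loop, and the framework, not the model, owns the explicit state and long-term memory, feeding only a small summarized slice into each prompt. The `llm_call` and `retrieve_memories` functions are stubs standing in for a real model call and a real retrieval index; they are assumptions for illustration, not any actual API.

```python
from typing import List

def retrieve_memories(memory: List[str], query: str, k: int = 3) -> List[str]:
    # Placeholder retrieval: just take the most recent entries.
    # A real system would use embeddings or some other index.
    return memory[-k:]

def llm_call(prompt: str) -> str:
    # Stub for an actual model call (hypothetical, not a real API).
    return f"(model reply to {len(prompt)} chars of prompt)"

def agent_step(state: EmotionalState, memory: List[str], user_input: str) -> str:
    # Only a compact summary of state and memory enters the context window.
    relevant = retrieve_memories(memory, user_input, k=3)
    prompt = (
        f"[mood: valence={state.valence:+.2f}, arousal={state.arousal:.2f}]\n"
        f"[recent memories: {relevant}]\n"
        f"user: {user_input}"
    )
    reply = llm_call(prompt)

    # The framework updates explicit state and long-term memory;
    # the model's weights never change.
    state.arousal = min(1.0, state.arousal + 0.1)
    state.fatigue = min(1.0, state.fatigue + 0.02)
    memory.append(f"user: {user_input!r} -> agent: {reply!r}")
    state.decay()
    return reply
```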