
I generally agree with you, but I think this argument overlooks the lack of an obvious reward function in large language models. GPT-N has no survival instinct, because there is no existential threat of death, no fitness function to optimize to extend its survival at any cost. Without this need to survive, there can be no motivation, and without motivation there can be no intent. And without intent, can there truly be any action?


Intent can be created by a combination of prompting and incorporating the model into a feedback loop of some kind: it has something it was tasked with (via prompting), and the feedback tells it to what extent the task has been completed; as long as the task is incomplete, it keeps generating more responses.

To crank it up a notch, the assigned task could involve the model generating subtasks for itself, which are handled in the same manner. This subtask generation could start to look a bit like will/intentionality (rough sketch at the end of this comment).

Now consider if the top-level task is something like maximizing money in a bank account that pays for the compute to keep it running :)

(IMO this is still missing some key pieces around an emotion-like system for modulating intensities of actions and responses based on circumstantial conditions—but the basic structure is kinda there...)
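
To make that concrete, here's a minimal sketch of the kind of loop I mean, in Python. call_llm is a hypothetical stand-in for whatever completion API you're actually using; everything else is made up for illustration:

    def call_llm(prompt: str) -> str:
        # Hypothetical stand-in: plug in whatever completion API you actually use.
        raise NotImplementedError

    def run_task(task: str, max_depth: int = 2, max_steps: int = 10) -> list[str]:
        """Re-prompt with progress until the model declares the task DONE."""
        done: list[str] = []
        for _ in range(max_steps):
            prompt = (
                f"Task: {task}\n"
                f"Completed so far: {done}\n"
                "Reply with the next subtask to perform, or DONE if the task is finished."
            )
            response = call_llm(prompt).strip()
            if response.upper().startswith("DONE"):
                break
            if max_depth > 0:
                # subtasks are handled "in the same manner", one level down
                done.extend(run_task(response, max_depth - 1, max_steps))
            done.append(response)
        return done

The top-level call is then just run_task("maximize the balance in the account that pays for my compute"), and all of the "intent" lives in that outer loop plus the prompt.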


We're already well down that road with autonomous agents; see BabyAGI, Auto-GPT and LoopGPT.


I think this kind of argument is making a similar mistake. In the same way that there's nothing fundamentally special about the computation human brains do, there's nothing fundamentally special about our "fitness function" (to reproduce).

It's just hard coded, whereas GPT's is dictated. More or less anyway.

Also our "fitness function" or motivations & goals aren't even that hard coded. You can easily modify them through drugs.


The argument is that LLMs don't have any motivations or goals, unless a human prompts them. They're not trying to stay alive or reproduce. They don't get hungry, feel pain or loneliness. They're just complex tools.


But six months ago, motivation to survive wasn't considered an aspect of intelligence.


I know. The implication of that argument is that the motivations or goals of humans are different from those prompts in some way that means that GPT is not "really" intelligent.

That's my point. It's raising some arbitrary human trait to special status, when it really isn't. Human goals are set by some external process too - evolution. And they aren't really even intrinsically fixed. They can be modified through drugs.

I think the strongest demonstration of that is post orgasm clarity (if you're a man anyway). Your whole motivation changes in an instant.


A tamagotchi can have these things programmed in; the only reason LLMs don't is that we didn't code them to do it, or train them in an environment that rewards maximizing those things.


If you give it a prompt telling it that it's controlling a character in a game that needs to survive or meet some other goal, and give it a choice of actions to take in the game, it will try to meet the goals given to it. Characters inside of GPT are perfectly capable of having goals.
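
For example, something as simple as the prompt below (all the game details are made up for illustration) already gives the "character" a goal to pursue each turn; feed its chosen action back into the game state and re-prompt:

    prompt = (
        "You are controlling a character in a survival game.\n"
        "Goal: stay alive as long as possible.\n"
        "State: health=40, food=1 ration, enemies nearby.\n"
        "Available actions: eat, hide, fight, flee.\n"
        "Choose exactly one action and briefly explain why."
    )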


>Without this need to survive, there can be no motivation, and without motivation there can be no intent.

I'm not sure that's as clear cut as you make it sound.

For example, curiosity could be a fine engine for motivation as well, even if you don't care whether you survive or not.


I don't see why you think it would be easier to encode "curiosity" than "survival".


All these things are possible to encode: novelty seeking ("curiosity"), or a survival drive. LLMs are trained on next-word prediction, but there's no fundamental reason we can't go in other directions.
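
To sketch what "encoding curiosity" could look like (just one possibility, borrowing the intrinsic-motivation idea from RL methods like ICM/RND; the names below are illustrative): reward the agent for reaching states it can't yet predict.

    import numpy as np

    def novelty_bonus(state: np.ndarray, predictor, target) -> float:
        # States the learned predictor still models poorly count as "novel".
        return float(np.mean((predictor(state) - target(state)) ** 2))

    def total_reward(task_reward: float, state: np.ndarray,
                     predictor, target, beta: float = 0.1) -> float:
        # beta trades the external task reward off against the curiosity drive
        return task_reward + beta * novelty_bonus(state, predictor, target)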


try being curious while starving and let us know how that goes :)


Being "curious while starving" might or might not be possible, but it's irrelevant to the point.

For starters, because the claim being argued is that you can have motivation (say, through curiosity) even without a survival instinct.

And it's perfectly possible to have no survival instinct and yet not be starving anyway. It's enough to have access to food, or to be fed, to avoid starvation. So lack of a survival instinct is not the same as starvation.

If we substitute "access to electricity" for "starvation" (since AGIs don't eat food), then as long as an AGI is provided with electricity by us, it can still have curiosity, and thus motivation, even if it has no survival instinct.


...yet



