> As AI edges humans out of the business of thinking, I think we need to be wary of losing something essential about being human
If AI edges humans out of the business of thinking, then we're all in deep shit, because it doesn't think; it just regurgitates previous human thinking. With no humans thinking, no advances in code will be possible. It will only be possible to write things that are derivatives of prior work.
(cue someone arguing with me that everything humans do is a derivative of prior work)
> For 99% of tasks I'm totally certain there's people out there that are orders of magnitude better at them than me.
And LLMs slurped some of those together with the output of thousands of people who’d do the task worse, and you have no way of forcing it to be the good one every time.
> If the AI can regurgitate their thinking, my output is better.
But it can’t. Not definitively and consistently, so that hypothetical is about as meaningful as “if I had a magic wand to end world hunger, I’d use it”.
> Humans may not need to think to just... do stuff.
If you don’t think to do regular things, you won’t be able to think to do advanced things. It’s like any muscle: if you don’t use it, it atrophies.
> And LLMs slurped some of those together with the output of thousands of people who’d do the task worse, and you have no way of forcing it to be the good one every time.
That's solvable though, whether through changing training data or RL.
> And LLMs slurped some of those together with the output of thousands of people who’d do the task worse
Theoretically fixable, then.
> But it can’t. Not definitively and consistently
Again, it can't yet, but with better training data I don't see a fundamental impossibility here. The comparison with a magic wand is, in my opinion, disingenuous.
> If you don’t think to do regular things, you won’t be able to think to do advanced things
Humans already don't think for a myriad of critical jobs. Once expertise is achieved on a particular task, it becomes mostly mechanical.
-
Again, I agree in essence with the original comment I was replying to. I do think AI will make us dumber overall, and I sort of wish it had never been invented.
But it was. And, being realistic, I will try to extract as much positive value from it as possible instead of discounting it wholly.
Your comment is nonsensical. Have you ever used any LLM?
Ask the LLM to... I don't know, to explain to you the chemistry of aluminium oxides.
Do you really think the average human will even get remotely close to the knowledge an LLM will return to such a simple question?
Ask an LLM to amend a commit. Ask it to initialize a rails project. Have it look at a piece of C code and figure out if there are any off-by-one errors.
Then try the same to a few random people on the street.
If you think the knowledge stored in the LLM weights for any of these questions is that of the average person, I don't even know what to say. You must live in some secluded community of savant polymaths.
I was thinking about this after watching YouTube short verticals for about 2 hours last night: ~2min clips from different TV series, movies, SNL skits, music insider clips (Robert Trujillo auditions for Metallica, 2003. LOL). My friends and I often relate in regurgitated human sound bites. Which is fine when I’m sitting with friends driving to a concert. Just wasting time.
I’m thinking about this time suck, and my continual return to my favorite hard topics in philosophy over and over. It’s certainly what we humans do. If I think deeply and critically about something, it’s from the perspective of a foundation I made for myself from reading and writing, or it was initialized by a professor and coursework.
Isn’t it all regurgitated thinking all the way down?
Creative thinking requires an intent to be creative. Yes, it may be a delusion to imagine oneself as creative, one's thoughts to be original, but you have to begin with that idea if you're going to have any chance of actually advancing human knowledge. And the stronger, wider, and higher you build your foundation (your knowledge and familiarity with the works of humans before your time) the better your chance of successful creativity, true originality, immortality.
Einstein thought nothing of import without first consuming Newton and Galileo. While standing on their shoulders, he could begin to imagine another perspective, a creative reimagining of our physical universe. I'm fairly sure that for him, as for so many others, it began as a playful, creative thought stream, a What If juxtaposition between what was known and the mystery of unexplored ideas.
Your intent to create will make you creative. Entertain yourself and your thoughts, and share when you dare, and maybe we'll know if you're regurgitating or creating. But remember that you're the first judge and gatekeeper, and the first question is always, are you creative?
> If AI edges humans out of the business of thinking, then we're all in deep shit
Also because we live under capitalism, and you need to do something other people need in order to be allowed to live.
For a century-plus, "thinking" was the task that was supposed to be left to humans as physical labor was automated. If "AI edges humans out of the business of thinking," what's left for humans, especially those who still need to work for a living because they don't have massive piles of money?
> If AI edges humans out of the business of thinking
This will never happen because the business of thinking is enjoyable and the humans whose thinking matters most will continue to be intrinsically motivated to do it.
> This will never happen because the business of thinking is enjoyable and the humans whose thinking matters most will continue to be intrinsically motivated to do it.
What world do you live in, where you get paid doing the things that are enjoyable to you, because they're enjoyable?