Doesn't that mean it supports the Chinese room argument? I'm not sure I follow your reasoning.
(also, popular consciousness forgets that technically the Chinese Room argument is only arguing against the much narrower, and now philosophically unfashionable, "Hard AI" stance as it was held in the 70s)
> the Chinese Room argument is only arguing against the much narrower, and now philosophically unfashionable, "Hard AI" stance as it was held in the 70s
Searle stood behind his argument in the 70s, and has continued to in every decade since.
The main sticking point is that most people fundamentally don't believe they are mechanistic. If one believes in dualism, then it is easy to attribute various mental states to that dualism, and of course a computer neural network cannot experience qualia like humans do.
I don't believe in a soul, and thus believe that a computer neural network — probably not today's models, but a future one that is large enough and has the right recurrent topology — will be able to have qualia similar to what humans and animals experience.
Searle's argument in the Chinese Room is only that passing the Turing Test isn't enough to prove evidence of Mind (capital 'M' to distinguish it as the philosophical jargon term, and all it entails). He does hold the stance that Computationalism (in the style of Dennett) is incorrect. I'm not sure if he personally feels the Chinese Room argument refutes that stance as a whole, but I believe the general consensus is that, as originally formulated in his essay, it does not stand as a total refutation of Computationalism without reading between the lines or squinting your eyes a bit. Searle does have a wider stance that computation cannot have things equivalent to mental states, especially intentionality. Obviously there is a whole separate debate about his correctness there, but I'm skipping over it to just discuss the Chinese Room.
That passing the Turing Test is not enough to exhibit evidence of Mind is not that controversial today. GPT-4 could easily pass the Turing Test as it was originally formulated, yet not many people think it possesses consciousness or intentionality or any mental states at all, really. We'd generally agree now that passing the Turing Test is only a step towards creating an actual artificial mind (how large or small a step is still up for debate).
Anyway, all this is a tangent, as I still don't understand why the original commenter feels this article provides a refutation of the Chinese Room argument when it seems (to me) to reinforce it. I'm just curious about that perspective and was interested in hearing more.
I find the argument to be essentially begging the question (if a system acts conscious, but you look inside it and don't find anything conscious in there, then it can't be conscious), though honestly I wouldn't say I'm sure I understand the argument. Given my understanding, I thought this illustrated the circularity. But I see I'm wrong, because if you buy the premise then you won't find the situation analogous.
That is, imagine there was a famous philosopher who insisted that cats can only be recognized by some non-computational mechanism, and that any computational cat recognizer might "simulate" recognizing cats but could not be said to actually recognize cats. Then you build a neural network that recognizes cats, and the philosopher opens it up and points out that nothing in it can recognize cats, so therefore it isn't "really" recognizing cats.
I understand the Chinese Room argument to be that because the human in the room doesn’t understand Chinese, the system doesn’t understand Chinese. In this case, none of the humans can recognize cats, but the collective can.
The flaw is the unsupported assertion that the whole system being conscious of X depends on a part of the system being conscious of X. The same assertion would fail here in the same way.
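That part/whole distinction can be made concrete with a toy example (my own sketch, not from the article): a tiny hand-wired network where no single unit computes XOR, yet the composition of units does. Each unit only does a weighted sum and a threshold; "computing XOR" is a property of the whole, not of any part.

```python
def unit(inputs, weights, bias):
    # A single "neuron": weighted sum followed by a step threshold.
    # By itself it only does trivial arithmetic.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

def xor(x, y):
    # Hand-chosen weights: h1 acts like OR, h2 acts like NAND,
    # and the output unit ANDs them together.
    h1 = unit([x, y], [1, 1], -0.5)     # fires if x OR y
    h2 = unit([x, y], [-1, -1], 1.5)    # fires unless x AND y
    return unit([h1, h2], [1, 1], -1.5) # fires if h1 AND h2

for x in (0, 1):
    for y in (0, 1):
        print(f"xor({x}, {y}) = {xor(x, y)}")
```

No call to `unit` "knows about" XOR, in the same way the man in the room doesn't know Chinese; the capability only exists at the level of the wired-together system.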