There is a fundamental difference between the way you teach a child and the way you can teach a computer. With a child, you point to a red tractor and say "red tractor". Eventually the child associates experiences of red and of tractors with the words "red" and "tractor". The only way you could do this with a machine is if the machine were conscious.
Since this has been done, we can only conclude that computers are already conscious.
If all it is doing is processing information, then it simply cannot associate any meaning with the word "red".
Whyever not? What is meaning if not information?
It is just a label that it is told applies to certain other things, which are also just labels.
Except that this is exactly what you are doing with the child.
All the computer can do is shuffle symbols about. All it can do is put "red" and "tractor" together to make "red tractor".
No. It can associate the terms, individually and in combination, with, for example, a picture of a red tractor. Which is exactly what we do.
It then says "tractor is red", but it never "knows" what any of the symbols actually mean, which is the point of Searle's Chinese Room argument.
That was indeed the point of Searle's Chinese Room argument, and it is something that Searle failed utterly to show.
To recap the argument: we have a man in a room full of books. Through a hole in one wall come messages written in Chinese, a language he can neither read, write, nor speak. He consults the books and, following simple functional rules, constructs a second message from the first, without ever translating either message into any language he understands.
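The Room's procedure, as described, is purely syntactic: match the incoming symbols against the rules, emit the paired reply, attach no meaning to anything. A minimal sketch might look like the following, where the rule book and messages are invented placeholders, not real Chinese or anything from Searle's paper:

```python
# A toy "rule book": a syntactic mapping from input strings to output
# strings. Nothing here carries meaning for the lookup mechanism itself;
# the pairings are hypothetical examples.
RULE_BOOK = {
    "symbol-string-A": "reply-string-A",
    "symbol-string-B": "reply-string-B",
}

def room(message: str) -> str:
    """Produce a reply by mechanical lookup; nothing is 'understood'."""
    # The default reply stands in for a rule like "ask for repetition".
    return RULE_BOOK.get(message, "reply-string-default")

print(room("symbol-string-A"))  # prints "reply-string-A"
```

The point of the sketch is only that the man's role reduces to the `get` call: pattern matching and copying, with the "knowledge" residing entirely in the table.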
But the second message is in fact an answer to the question in the first message. To answer the question, the Chinese Room must understand Chinese.
The man, as we have stipulated, does not understand Chinese.
Books are merely data; they represent knowledge, but they certainly do not understand anything.
Searle and his supporters see this as a comprehensive refutation of functionalism. Functionalists see it as obvious nonsense. It is neither the man nor the books that understands Chinese but the combined system: the man providing the logical processing and the books providing the information.
Searle's response to this is to consider the case where the man has memorised the contents of the books. Now he does everything himself, and yet he still does not understand Chinese. He cannot tell you what any of the questions means, or even the answers that he himself has written.
This rejoinder misses the point by a parsec. The man has constructed a new consciousness using his own conscious processing, just as he did in the Room. That consciousness understands Chinese, just as the Chinese Room understood Chinese. That it is happening inside his brain rather than by the interaction of his brain with the books makes not the slightest difference; it is exactly the same argument and fails in exactly the same way.