westprog
Philosopher
Joined: Dec 1, 2006
Messages: 8,928
Name one thing in reality that cannot ever be sufficiently simulated (or emulated or whatever) in a computer or robot to provide the inputs necessary for consciousness to emerge.
"Consciousness to emerge". That doesn't sound particularly rigorous to me.
If you can't name anything, then YOU are the one invoking magic. Not me.
Could you perhaps precisely describe what is scientifically necessary for "consciousness to emerge"? Just what "inputs" are needed?
A bunch of vague waffle, a demand to disprove it and the obligatory reference to magic.
Making the argument that "a simulation is not the real thing" is irrelevant. An artificial heart is not the real thing either, and yet it can pump blood as effectively as (or even more efficiently than) the real thing.
We know this because we fully understand the functionality of the heart. We can precisely define what it does, and what we want it to do. Apart from anything else, we can replace the heart (albeit temporarily) with a simple pumping system and see that it works.
In the case of consciousness, the claim appears to be that because the term is so vague, and the functionality so much more complex, it's possible to make equally vague assertions and demand that someone disprove them.
Of course they would say it is inconclusive. What they are NOT doing is offering principles that would make it impossible.
Maybe you should study what science already knows about consciousness. We don't know everything yet. But what we do know seems contrary to the notion that consciousness can't ever be simulated or emulated in a machine.
"Machine" is good. "Machine" encompasses everything.
Another frequent feature of this discussion is to make very precise and specific claims - relating to algorithms, the Church-Turing thesis and computation - and then rephrase them in a totally open way. Because if someone denies that a machine might duplicate the functionality of the brain - well, that's just a claim of mysticism and magic and god.
Obviously, if consciousness is associated with particular physical processes, then a machine that duplicated those processes would produce the same effect. The computational claim is that, unlike every other process going on in the human body, no particular physical process is essential to consciousness.
I suppose it depends on what points one is trying to make.