Then why did piggy bring up that a very recent book on cognitive neuroscience devotes a whole chapter to dispelling the ideas of the Hard-AI crowd?
Did you read the chapter? If not -- end of story.
That goes to the other point he brought up. Computation in nature is ubiquitous. Saying something is computational provides no new insight, since everything is computational when looked at properly. Besides, computation really is of a different kind than the experience of sensation.
Computation is about math and abstraction, experience of sensation is about what it feels like to be something. Those are very different categories.
I have explained a dozen times that the "computation" spoken of in the computational model is fundamentally about attractors in physical systems.
That invalidates all your statements about computation. You don't really understand what it is, and neither does piggy.
Yeah... and when someone points out that pulleys and cogs will do, that should be it for the Hard-AI position, done, finito, end of story. There are humans, and we are conscious because we experience sensation. There are no sensors on ropes and pulleys. Ropes and pulleys do not follow the same physics we do (or, to put it better, the physics that applies to us differs in many ways from the physics that applies to them). The structure the ropes and pulleys get put into determines whether they are computing or not, not any property of the ropes and pulleys themselves (such as mass, charge...). There are lots of properties that, if changed in a human, will cause unconsciousness. They are different.
I just think you don't have much imagination, no offense.
Because for you to state that ropes, pulleys, and cogs will not do implies that you are capable of imagining what a system composed of trillions of them would entail.
If you had done so, I don't think your dismissal would have been so casual.
What I think is that you imagined some lame version, like maybe what one would see in a Rube Goldberg machine put together by college kids, and you just reached the conclusion that, hey, adding trillions more parts wouldn't change anything.
The above applies the same to modern computers.
The above applies to your conclusion about what applies to modern computers.
No research says that consciousness comes from computation. The researchers themselves (per the litany of quotes piggy gave earlier) do not have any solid ideas on that front. So try not to be nihilistic about not knowing what causes consciousness, because you probably do not know.
I already showed that piggy's quotes were mined and taken out of context.
He misrepresents the state of cognitive science and artificial intelligence, plain and simple.
Why don't you just look for yourself? Please, don't just take my word for it. Do some looking on your own.
I do not need to account for it, since the relevant concepts are already out there. We agree that focusing on red and merely having red somewhere in your visual field are different in a way. The first difference is that when one focuses on something visually, the object of focus, as far as I know, goes to the center of the field. There are probably other differences of which I am not aware at the moment.
What do you mean you are not aware of the differences?
The difference is that when you focus on something you are consciously aware of it.
I have to question your ability to use introspection as a tool if you don't realize that the conscious awareness of an experience is different from the alternative.
Let me ask you this, I guess -- do you think it would be possible to have conscious awareness of the red frowny face, if there were no experience of red?