I'm not arguing that computation plays no role in consciousness, or that thinking is not a form of computation. The problem with computationalism is that it approaches the subject strictly as a computational abstraction. Consciousness is not simply a matter of performing the right ops; what is essential is the -kind- of system performing them. What we should be focusing on is finding out what substrates provide conditions sufficient for producing consciousness.
I'm not sure that I would agree with that. I just listed out all the potential problems that I can see with a digital computer duplicating what neurons do, but I'm not sure that there isn't a way to overcome these issues -- maybe the AI folks can help me out here.
Initially I thought that spatial summation might be a problem, but I don't think it is. Temporal summation is clearly no problem.
The yes/no, 0/1 decision easily duplicates the all-or-nothing event at the axon hillock, and the multiple inputs (dendritic synapses) can be duplicated with multiple weighted inputs -- a rough sketch of both kinds of summation is below.
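To make that concrete, here is a minimal sketch (in Python) of the kind of unit I have in mind: weighted dendritic inputs give you spatial summation, a leaky membrane potential gives you temporal summation, and an all-or-nothing output stands in for the axon hillock. All of the names and numbers (weights, leak factor, threshold) are invented for illustration; this is not offered as a biologically faithful model.

```python
# Toy unit: spatial summation = weighted sum over synapses,
# temporal summation = inputs close in time piling up on a leaky potential,
# axon hillock = an all-or-nothing threshold. Constants are invented.

class ToyNeuron:
    def __init__(self, weights, threshold=1.0, leak=0.7):
        self.weights = weights        # one weight per dendritic synapse
        self.threshold = threshold    # firing threshold ("axon hillock")
        self.leak = leak              # fraction of potential kept each time step
        self.potential = 0.0          # membrane potential

    def step(self, inputs):
        """inputs: 0/1 spikes on each synapse for this time step."""
        self.potential *= self.leak   # temporal summation: old potential decays but lingers
        self.potential += sum(w * x for w, x in zip(self.weights, inputs))  # spatial summation
        if self.potential >= self.threshold:   # all-or-nothing output
            self.potential = 0.0               # reset after firing
            return 1
        return 0


neuron = ToyNeuron(weights=[0.5, 0.4, 0.3])
# two sub-threshold volleys arriving close together summate, and the second one fires
print(neuron.step([1, 0, 0]))  # 0 -- below threshold on its own
print(neuron.step([0, 1, 1]))  # 1 -- leftover potential plus new input crosses threshold
```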
The two big problems, as far as neurons are concerned, are these: (1) I don't see an easy way to duplicate the actions of metabotropic receptors, which can essentially alter the membrane threshold for seconds to minutes (it's more complicated than that), or of the other metabotropic receptors that cause longer-term changes (essentially in threshold) by turning on certain genes; and (2) we start life with orders of magnitude more neurons than we end up with, most of them dying because they do not receive input (they don't learn).
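For what it's worth, here is one naive way a program might approximate problem (1): treat the metabotropic effect as a slow state variable that shifts the firing threshold and decays over thousands of time steps rather than a few. The time constants and magnitudes are invented, and nothing here pretends to capture second-messenger cascades, let alone gene expression; it only shows that slow modulation isn't obviously out of reach for a digital system.

```python
# Fast ionotropic dynamics plus a slow "metabotropic" threshold shift.
# All constants are invented for illustration.

class SlowlyModulatedNeuron:
    def __init__(self, weights, base_threshold=1.0, leak=0.7, mod_decay=0.999):
        self.weights = weights
        self.base_threshold = base_threshold
        self.leak = leak
        self.potential = 0.0
        self.mod = 0.0                # slow shift in excitability
        self.mod_decay = mod_decay    # decays over thousands of steps, not a few

    def modulate(self, amount):
        """A metabotropic-style event: no spike, just a lingering change in threshold."""
        self.mod += amount

    def step(self, inputs):
        self.mod *= self.mod_decay                    # seconds-to-minutes timescale
        threshold = self.base_threshold + self.mod    # excitability drifts slowly
        self.potential = self.potential * self.leak + sum(
            w * x for w, x in zip(self.weights, inputs))
        if self.potential >= threshold:
            self.potential = 0.0
            return 1
        return 0


n = SlowlyModulatedNeuron(weights=[0.6, 0.5])
n.modulate(0.5)        # raise the threshold; the effect lingers across many steps
print(n.step([1, 1]))  # 0 -- input that would normally fire is now sub-threshold
```

Problem (2) could likewise be faked by dropping units whose inputs stay silent over a long window, though whether these workarounds add up to duplicating what neurons actually do is exactly what I'm unsure about.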
Other problems seem to me to be conceptual -- we view the nervous system, and perception in particular, in ways that are simply not correct. We think that there are separate sensory and motor systems, but there is no such clear-cut division. We don't so much receive visual information at higher levels as hypothesize what is out there and construct our image of the world based on the lower-level inputs we have received. Seeing is an active process; we see things that we find reasons to attend to.

The other issue concerns what emotion and feeling may be. If they are processed sensory information, combined with cognitive reasoning, that promotes a behavioral tendency, then it isn't clear to me what type of computation would perform that task. One consequence is that it wouldn't be just computation; it would necessarily have to be causal activity -- a behavioral tendency makes no sense as a computation, only within a causal system -- so this is only going to be possible in a robot. Doing it on a desktop just isn't going to work.
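To go back to the perception point for a moment: here is, very roughly, what "hypothesize what is out there" could look like as a toy loop. The system keeps a guess, predicts what the low-level input should be, and uses the input only to correct the guess; it never stores or "receives" the image directly. The function, the gain, and the numbers are invented for illustration and say nothing about how the visual system actually does this.

```python
# Toy predict-and-correct loop: the percept is the final hypothesis,
# not any single reading. Numbers are invented.

def perceive(sensor_readings, guess=0.0, gain=0.3):
    """Iteratively revise a one-number 'hypothesis' from a stream of noisy readings."""
    for reading in sensor_readings:
        predicted = guess               # what we expect to sense, given the current guess
        error = reading - predicted     # mismatch between expectation and input
        guess += gain * error           # revise the hypothesis rather than store the input
    return guess


print(perceive([0.9, 1.1, 1.0, 0.95]))  # roughly 0.75 with these numbers; the guess keeps converging
```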
I think we simply need to think of computation in a different way -- certainly not as pure abstraction, since pure abstraction will never be able to do what needs to be done for consciousness. The brain works as a sensorimotor integrator, and we cannot leave out the motor side of the equation and get an answer to this issue.
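As a rough illustration of what I mean by a sensorimotor integrator, and why the motor side can't be left out, here is a toy closed loop in which what the system senses next depends on what it just did. The "world", the control rule, and the constants are all invented; it is only meant to show the shape of the loop, not to settle whether such a loop could ever feel anything.

```python
# Closed sensorimotor loop: sensing and acting are one cycle, not two stages.
# Everything here is invented for illustration.

def sensorimotor_loop(steps=5, position=0.0, target=3.0):
    estimate = 0.0
    for _ in range(steps):
        reading = target - position                  # sensing depends on where we are now
        estimate = 0.5 * estimate + 0.5 * reading    # integrate sensing over time
        action = 0.4 * estimate                      # motor output driven by the estimate
        position += action                           # acting changes what is sensed next
        print(f"sense={reading:+.2f}  act={action:+.2f}  pos={position:.2f}")


sensorimotor_loop()
```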