I hate this red herring about whether a Turing Machine is a general purpose computation machine or not. Off topic and irrelevant. Maybe, like qualia, we can side step it with a synonym. How about GPCM (general purpose computation machine)?
Thought experiment time!
1) We wire a sufficiently advanced GPCM, programmed to do exactly what a spider's brain does in handling input and output, to a spider after removing its brain, and we confirm that the spider does everything a real spider does in every way.
2) We do the same thing with a human. We get the same result. It even spontaneously sings rhapsodically about how the subjective experience of redness must be somehow immaterial and incomputable, even though it was not specifically programmed to do that.
Can #2 happen? If not, why not? If it happened, would it be conscious?
The only reason I harp on it, Mr. Scott, is because certain people play semantic games to avoid giving an honest answer to your question #2.
For example, I suspect a counter-argument could be something like: "but a Turing machine can't account for real-time events, so if it were hooked up to a human body it couldn't be a Turing machine. So clearly computation alone isn't responsible for consciousness." Which is just another way of stating that a GPCM has some aspects that an idealized Turing machine does not -- well, durr, because a GPCM is real and a Turing machine is not.
Except, that is just a semantic triviality. It doesn't even begin to address the serious and honest question you bring up of "if you can replace the brain with a computer, what are the implications?" It is simply a sidestep to avoid having to even think about "what if."
Many people here have already had discussions about your #2, and about various variations of it that are more or less radical.
For instance, what if you didn't even use GPCMs? What if you simply took real biological neurons and constructed a human brain from scratch -- exactly to the blueprints of an existing one, mind you -- and then hooked it up inside a body's head. Would that be conscious? Curiously, a few posters here have refused to give a response to even this seemingly non-contentious hypothetical.
A little more radical: instead of replacing the brain wholesale with the GPCM, what if you started at a lower level and just replaced some of the interactions between ion channels (or any other proteins, for that matter) with the GPCM analog? Or what about the entire neuron? Then you have a brain made of artificial neurons run by GPCMs that is still physical and still the size/shape of a normal brain. Is that more acceptable, for the sake of argument, to some people?
Then we have a situation like you are asking about -- just replacing the whole brain with a single GPCM. But why stop there? How about just replacing the whole person with one? Or why not the whole environment of the person as well?
Which leads to the very interesting question -- that cannot be answered, by the way -- of whether or not we are in a simulation ourselves, being nothing more than information processed by some GPCM.
All of these are very good things to think about. The problem is, human emotion gets in the way, and invariably people balk and cling to irrational positions.
For example, it took me quite a while to agree that stepping into the teleporter, and having the source destroyed while the destination is merely a copy, is mathematically and therefore physically no different from what occurs to us every instant of our normal lives. Now that I understand it, I would have no hesitation in stepping into the teleporter. However, I know that many people who are supporters of the computational model of consciousness still refuse (irrationally, in my opinion).
I think that most intelligent people, thinking about these issues, will eventually arrive at what seems logically evident. For example, if you replace certain parts of neurons with computers, the person is still gonna be conscious. But to then make the jump to having the whole brain replaced with a computer ... well, that takes more logical override of emotion than many people have. And to be honest, it still seems strange, even to me. However, I can't just discount logic and mathematics because it seems strange.