
On Consciousness

Is consciousness physical or metaphysical?


The OP asks for a dialogue on the nature and computability of consciousness, so imaginary conscious machines would seem to be OK to discuss here, along with discussions of the only conscious objects we can agree on -- animal brains.

Of course it's OK to discuss here.

I can't imagine it's productive, but it's certainly OK.

It's just not anything I care to discuss, so I'll bow out of that portion of the thread.
 
I just didn't communicate everything that was in my mind and expected y'all to be charitable and fill in the gaps. My mistake.

It's been said that a computer, perfectly simulating every function of the brain, could not be conscious because consciousness is a "performance" or "real-world event" that a functioning computer isn't doing. In fact, a computer running a program is, indeed, performing a real-world event. E.g. a computer game in self-play mode, with the screen and speakers off and no one touching the controls, is performing the game, and this performance is real-world, since the computer's switches are operating in the real world. Now, explain to me how a brain, in a paralyzed body in sensory deprivation but still conscious, is "performing a real-world event" in a way that a computer game in self-play mode with I/O off is not.
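
To make the point concrete, here's a minimal sketch (the "game" logic is invented purely for illustration, not taken from any real engine) of a self-play loop with all I/O stripped out. Every iteration still flips real transistors in a real machine, even though nothing is ever displayed or heard:

```python
import random

# Toy "game" in self-play mode: two scripted players, no screen, no sound,
# no controller input. The state transitions still occur in the machine's
# physical switches, whether or not anyone ever observes them.
def self_play(rounds=100, seed=0):
    rng = random.Random(seed)
    state = {"score": [0, 0], "turn": 0}
    for _ in range(rounds):
        player = state["turn"] % 2
        move = rng.choice(["attack", "defend", "pass"])  # scripted "AI" choice
        if move == "attack":
            state["score"][player] += 1
        state["turn"] += 1
    return state  # the game was "performed" even though nothing was rendered

final_state = self_play()
```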

Yes, but that game only exists in a system which includes the machine and an observing brain capable of interpreting the symbolic output as a game.

If all humans were to go extinct while the computer is running, the "game" would disappear, and all you'd be left with is light and sound displays.

The computer itself would have no experience of any game.

Just had to point that out, now I'll try to keep my promise and shut up about it.
 
Of course it's OK to discuss here.

I can't imagine it's productive, but it's certainly OK.

It's just not anything I care to discuss, so I'll bow out of that portion of the thread.

I can only conclude that you have no understanding of the power of computer simulations.
 
Yes, but that game only exists in a system which includes the machine and an observing brain capable of interpreting the symbolic output as a game.

If all humans were to go extinct while the computer is running, the "game" would disappear, and all you'd be left with is light and sound displays.

The computer itself would have no experience of any game.

Piggy, this sounds a bit circular, so correct me if I'm wrong. Are you assuming from the get-go that a non-brain, say a computer playing the game or observing the situation, doesn't count?

I mean, who cares whether it's still called "a game" or not? It's still running and still operating the same way.
 
OK. There are simulators, and there are the things they simulate. It's not symmetrical. A "Universal Turing Machine," which is what all modern computers actually are (though optimized and with I/O), has been shown to be able to simulate any computable process. The things it simulates, though, cannot necessarily simulate a simulator.

E.g. a driving simulator can be made to simulate any car. A car is not made to simulate a driving simulator.

Maybe Pixy called the brain a Turing Machine. I might agree with him in the context of how he was saying it. But the brain is not a Turing Machine in the strict sense. That a Turing Machine can simulate a tornado does not imply that a tornado IS a Turing Machine.
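
For anyone unfamiliar with the term, here's a toy sketch of what "simulating" means in the Turing Machine sense (the incrementer machine below is made up just as an example): the host program steps through any machine you hand it as a transition table, but the machine described by the table has no corresponding ability to step through the host.

```python
# Toy Turing-machine simulator: the host program acts as the simulator,
# and the transition table is the machine being simulated.
def run_tm(table, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example machine (invented for illustration): append one '1' to a unary string.
increment = {
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("halt", "1", "R"),
}

print(run_tm(increment, "111"))  # prints "1111"
```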

That's what I'm trying to say about the brain. If you program a computer to simulate every neuron in a brain and all the interconnections, and run that simulation, why wouldn't it be "performing consciousness?" The performance of a computer program is in the real world.
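
To be clear about what I mean by "simulate every neuron and all the interconnections," here's a toy-scale sketch using leaky integrate-and-fire units (all the numbers and the random wiring are made up for illustration; a brain-scale run is the same idea with roughly 86 billion units and realistic connectivity):

```python
import numpy as np

# Toy network of leaky integrate-and-fire neurons: each unit's potential
# leaks toward rest, sums weighted spikes from the units wired into it,
# fires when it crosses threshold, then resets. Parameters are illustrative.
rng = np.random.default_rng(0)
n = 100                                 # number of neurons in this toy run
weights = rng.normal(0.0, 0.5, (n, n))  # the "interconnections"
v = np.zeros(n)                         # membrane potentials
threshold, leak = 1.0, 0.9

spike_counts = np.zeros(n)
for step in range(1000):
    spikes = (v >= threshold).astype(float)                    # which neurons fire
    v = leak * v + weights @ spikes + rng.normal(0.0, 0.2, n)  # leak + input
    v[spikes > 0] = 0.0                                        # reset fired neurons
    spike_counts += spikes

print("mean firing rate per step:", spike_counts.mean() / 1000)
```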

Well, it's simply because it's a bodily function. If you program a computer to simulate digestion, does that mean you've made a machine which actually performs digestion in the real world?

Of course not.

If you want to build a machine that digests -- or maintains a blood pressure, or regulates its own temperature, or plays guitar, or whatever a human body can do -- then you have to build the machine to do it. Even if you simulate every single cell in the body in a digital virtual environment, no bodily function is actually performed by the machine running the simulation.

The generation and maintenance of a phenogram is a bodily function. It happens in a specific part of the body at specific times. It's machine-generated by an organic machine.

If you want to build a synthetic machine which generates and maintains a phenogram in real spacetime, you have to build it. A computer can certainly be part of such a machine, but programming alone won't get you there.
 
Maybe Pixy called the brain a Turing Machine.
No, I didn't say that. But I did point out that - aside from the problem of it being a real-world computer and thus (a) finite and (b) necessarily prone to error - the brain is (a) provably no less powerful than a Turing Machine and (b) computable, and thus no more powerful than a Turing Machine.

That's what I'm trying to say about the brain. If you program a computer to simulate every neuron in a brain and all the interconnections, and run that simulation, why wouldn't it be "performing consciousness?" The performance of a computer program is in the real world.
There is a subtle question there of whether a closed simulation is conscious, but an open simulation - one that can communicate in any way - with those properties cannot fail to be conscious.

Consciousness is defined by the way it deals with information, not by any specific physical process, input, or output; any system that deals with information in the same way as a given conscious system is conscious by definition. Both systems are real-world physical processes - there's nothing else to be, after all. Piggy's argument on that point is simply meaningless.

The brain is a computer; it processes information, and we identify its behaviour by the information it generates, not by the waste heat and CO2 generated by the metabolism of the neurons. If you simulate it with another computer (and get the simulation right) the results will necessarily be the same. They can't not be the same. The simulation can't not be conscious. To insist on a difference is magical thinking.

Tornadoes are weather systems; we identify tornadoes as swirling columns of air. If you simulate a tornado by swirling columns of air around, you have a tornado.
 
I can only conclude that you have no understanding of the power of computer simulations.

It's not a matter of their power, it's a matter of their nature.

No simulation of a tornado poses any threat to anyone in the room with the machine running that simulation, no matter how powerful that simulation is.
 
Right. Which is dualism, at best.

Not at all. No dualism required. Only those who believe in a "world of the simulation" which somehow exists independently of both the machine and observers are engaging in dualism.

Pixy, you're free to fantasize that your thermostat is conscious.

But that's all it is -- fantasy.

If you want to be scientific about it, you'll need to first base your definitions and theories on observations of conscious objects -- mammal brains.

Then, you'll need to test your conclusions against observation of those same objects.

Einstein's theories predicted that gravity bends light, but the prediction wasn't confirmed until we observed starlight deflected around the sun during a solar eclipse.

Your ad hoc definition of consciousness is convenient for you, but it's been proven unworkable in the lab. End of story.

You can go building castles in the air without checking your ideas against the reality of the behavior of conscious objects, but they have no real value, except as fantasies.
 
Well, it's simply because it's a bodily function. If you program a computer to simulate digestion, does that mean you've made a machine which actually performs digestion in the real world?

Of course not.

If you want to build a machine that digests -- or maintains a blood pressure, or regulates its own temperature, or plays guitar, or whatever a human body can do -- then you have to build the machine to do it.

That's an interesting analogy. If you want a machine that can digest food, then it needs to be able to ingest it, break it down, and use it as fuel or for some other function, right?

What do you need to make a machine operate like a brain? Process data?
 
Not at all.
As I said, it's dualism at best. If you like, you could claim your argument to be completely incoherent rather than merely inconsistent.

What that would gain you I couldn't possibly say.

Only those who believe in a "world of the simulation" which somehow exists independently of both the machine and observers are engaging in dualism.
Nobody claims that. Nobody has ever claimed that. It is simply a strawman you like to trot out to cover up your other fallacies.

Pixy, you're free to fantasize that your thermostat is conscious.

But that's all it is -- fantasy.
But it's your fantasy, Piggy. The entire point of Dennett's thermostat example is that a thermostat is not conscious. This has been explained dozens of times. You are clearly not paying attention to anything anyone says, so it's clearly not worth saying anything further to you.
 
The phenomenon we need a definition of is currently only known to exist in animal brains. Whether or not it actually is or becomes possible for machines to be conscious, we will never have any way of actually investigating the possibility without a much better understanding of the neurobiology first. I'm as interested as anyone here in the prospect, and I'm frankly thrilled by the notion that we might one day figure out how to build conscious machines, but Piggy is right that it's a complete waste of time to jump the gun at this point.
Except that Piggy is jumping the gun at this point. From our understanding of neurobiology, we can say that consciousness as it is usually defined* is an illusion, and that we haven't found anything in the brain which cannot be simulated in silico, in as much detail as we care to put into it. It's possible such a thing exists, which would indeed make it difficult or impossible for a machine to be conscious, but we haven't found it yet. Or anything like it. The null hypothesis at this point is that we will continue to not find anything, suggesting machines can be conscious as soon as they get beefy enough to run full-scale neural simulations.

That's the problem with Piggy's logic: he's rejecting the null without evidence. Though a machine could be programmed to have virtual neurons which interact in the same manner as a human's, and though it may behave and walk and quack just like a consciousness, he's arguing it isn't really conscious, because it lacks that je ne sais quoi only to be found in real wetware. Much earlier in the thread Pixy derisively termed this idea a "magic bean." If you act conscious and have a magic bean, congratulations, you're conscious. If not, sorry bro.

Most of Piggy's latest walls of text have been a careful redefinition of the argument to make it seem like this isn't the case, so he can restate his assertions unchallenged. It's not working this round, either.


*The definition of consciousness tends to vary from person to person and moment to moment. It's one of those difficult concepts that people really like to hang onto, so it scuttles off into the darkness the instant you try to keep a light fixed on it.
 
Except that Piggy is jumping the gun at this point. From our understanding of neurobiology, we can say that consciousness as it is usually defined* is an illusion, and that we haven't found anything in the brain which cannot be simulated in silico, in as much detail as we care to put into it. It's possible such a thing exists, which would indeed make it difficult or impossible for a machine to be conscious, but we haven't found it yet. Or anything like it. The null hypothesis at this point is that we will continue to not find anything, suggesting machines can be conscious as soon as they get beefy enough to run full-scale neural simulations. <snip>

But Piggy isn't saying it's impossible for a machine to be conscious. He's saying he thinks computers, as we know them, aren't the right sort of machine. Brains do other things besides process information. They contain other tissues and structures besides a network of neurons. His claim is that something besides the network of neurons alone is responsible for consciousness.
 
But Piggy isn't saying it's impossible for a machine to be conscious. He's saying he thinks computers, as we know them, aren't the right sort of machine. Brains do other things besides process information. They contain other tissues and structures besides a network of neurons. His claim is that something besides the network of neurons alone is responsible for consciousness.
Yes. That's the magic bean. He can't say what it is or what it does or why it can't be simulated too, but damn if it isn't an essential prerequisite for consciousness.
 
His claim is that something besides the network of neurons alone is responsible for consciousness.

And it's a baseless claim unless he can show that, and to do that we need his definition of "conscious".

By the way, this is the first time I've seen "in silico", but I am so using that from now on.
 
Except that Piggy is jumping the gun at this point. From our understanding of neurobiology, we can say that consciousness as it is usually defined* is an illusion, and that we haven't found anything in the brain which cannot be simulated in silico, in as much detail as we care to put into it. It's possible such a thing exists, which would indeed make it difficult or impossible for a machine to be conscious, but we haven't found it yet. Or anything like it. The null hypothesis at this point is that we will continue to not find anything, suggesting machines can be conscious as soon as they get beefy enough to run full-scale neural simulations.
It's stronger than a null hypothesis, though. From everything we know about mathematics and physics, such a thing doesn't seem to be possible. We could get a surprise from mathematics if someone manages to formulate a class of computational process more powerful than the Turing machine (so far no one even has any coherent idea of how to define such a thing - the "Oracle" device is magical) - or from physics where.... Um. No, can't think of how physics could turn out to be such that anything Piggy says would make sense. Anyone?
 
It's not a matter of their power, it's a matter of their nature.

No simulation of a tornado poses any threat to anyone in the room with the machine running that simulation, no matter how powerful that simulation is.

...but an excellent simulation of a tornado will act just like a tornado. Why wouldn't an excellent simulation of a conscious brain act just like a brain and therefore be conscious?
 
Yes, but that game only exists in a system which includes the machine and an observing brain capable of interpreting the symbolic output as a game.

If all humans were to go extinct while the computer is running, the "game" would disappear, and all you'd be left with is light and sound displays. The computer itself would have no experience of any game.
Just had to point that out, now I'll try to keep my promise and shut up about it.

No, if humans were extinct and the computer was powered up and running, it would continue experiencing the game. The game would not disappear.

If a brain running outside a body were dreaming, would it not be experiencing a dream if no other humans were around?
 
Piggy, you didn't answer my question:

If you program a computer to simulate every neuron in a brain and all the interconnections, and run that simulation, why wouldn't it be "performing consciousness?"
 
Piggy, you didn't answer my question:

If you program a computer to simulate every neuron in a brain and all the interconnections, and run that simulation, why wouldn't it be "performing consciousness?"

Yes, I did answer your question.

If you program a computer to simulate every molecule in the body, and run that simulation, why wouldn't it be "digesting"?

Answer that, and you have your answer for consciousness.
 