Has consciousness been fully explained?

Status
Not open for further replies.
Piggy, if I could steer this in a different direction, do you think that it's actually possible to describe consciousness in terms of "physical" processes? Let's say we get to the point that we are able to pinpoint the processes that lead to the generation of consciousness. We know which physical systems would result in consciousness and which would not. And we can give a detailed description of the distinction and of how these conscious systems work structurally and mechanically. Could such a description adequately tell us what consciousness is? Or, to put it another way: could a detailed description of a rabbit's conscious system tell us what it is like to be a rabbit? That is, can that "what it is like" be described in terms of physical processes by biological science?

I ask, because this is where the mind/body problem rears its head for me. I'm not a dualist (due to major problems I see with dualism), but I have to answer no to this question.

I don't know if that question is strictly answerable at this point, but this is how I see it.

If there's an overlap with another species' physical apparatus, then we will obviously know to some extent "what it's like" to experience the world the way that animal does.

Where there are differences, we probably will not know until/unless we are able to find ways to wire up our brains to simulate such an experience.

So let's say, for instance, that we confirm that birds are conscious. Well, we know they're sensing magnetic fields in a way we don't. And this sensation would surely "feel real" to them in some sort of way.

But we'd have no way of guessing what their conscious experience of it would be like.

It's like meeting an alien species that didn't use chemicals to produce the sense we call "smell": I don't see how you could ever explain it to them, even if you talked their ears off (or whatever they had).

Could you wire a human brain to roughly simulate a cat's experience?

Maybe, in a way. You could start by boosting the signal from our emotional centers, cranking up the olfactory sense, tamping down or offlining many higher-level cognitive functions.

But even then, you wouldn't know what it feels like for a cat to smell the mix of scents on your back patio, because you wouldn't know how to map your heightened aromatic experience over to your gonzo emotions.

And it may prove impossible to make a human brain even produce the mix of emotional stimulants that cats are using, much less coordinate it with the triggers that have evolved in their brains over millennia, much less weave that in with everything else that's contributing to "what it's like to be a cat".

At the end of the day, we may have to accept that the actual experience of being anything else but human simply lies beyond our knowledge, like whatever's beyond our light cone, or the precise position and momentum of a particle.

ETA: If you wired up your brain so that it was a cat's brain, you wouldn't be you anymore; in other words, it would be a cat's brain made from pieces of what used to be your brain. But if you try to wire a simulated cat's brain so that it feeds into your brain, so much will be lost in translation that you still won't be able to experience what it's like to be a cat, because you have no choice but to experience the world as a human does.
 
Why are humans hardwired to take abstractions more seriously than the real thing?

I think it's because we have to be able to treat some abstractions, mentally, in the same way we treat physically real things.

That's what allows us to mentally manipulate (analyze, question, predict) a hunting expedition, our relationships with other people, north, and all sorts of other stuff.
 
That's because those computer simulations don't physically perform the same task. A simulated power plant doesn't produce power; it only outputs information about how a power plant produces power.

But take, for example, an artificial heart: assuming it works properly, it performs the same function as a natural heart. It pumps blood. It takes in blood and puts out blood. The way I see it, the brain is a sort of information pump. It takes in information from the senses and the rest of the nervous system, and it outputs information to the rest of the nervous system, the muscles, and other actuators. If it is possible to create a computer simulation of that brain, and that computer feeds all the information from a real body into the simulation and sends the output of the simulation back to the body the same way the brain would, then the computer and the simulation together function in exactly the same way as a brain would. They would physically perform the same task as the brain.
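To sketch what I mean by "information pump" in code (everything here is invented for illustration, not anyone's actual proposal): the claim is only that the processing core is swappable so long as the input/output contract with the body is preserved.

```python
# Toy sketch of the "information pump" view. The names and data are made up;
# the point is that `process` could be a biological brain, a functional
# model, or a simulation wired to real transducers -- the loop is the same.
from typing import Callable

def run_body_loop(process: Callable[[dict], dict],
                  sensory_frames: list[dict]) -> list[dict]:
    """Feed each frame of sensory input to a processing core and collect
    the motor commands it emits."""
    motor_outputs = []
    for frame in sensory_frames:
        motor_outputs.append(process(frame))
    return motor_outputs

# A trivial stand-in core: echo back a command proportional to the input.
def toy_core(frame: dict) -> dict:
    return {"muscle_signal": frame.get("nerve_signal", 0) * 2}

commands = run_body_loop(toy_core, [{"nerve_signal": 1}, {"nerve_signal": 3}])
print(commands)  # [{'muscle_signal': 2}, {'muscle_signal': 6}]
```

Swap `toy_core` for any other core with the same signature and the body-side loop never notices; that is the whole functional-equivalence claim in miniature.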

If my brain were replaced with such a computer running such a brain simulation program, the new computer-controlled me would certainly not be any less confused about what you mean by "Sofia" than I am now. I see no reason why it would suddenly say "Piggy, I now understand what 'Sofia' is/are. Previously, when I was a brain, I had them, but now that I am a computer program I suddenly notice that they are missing." I am pretty sure it would consider your Sofia concept the same way as I do now; basically, you have just given problematic concepts such as "consciousness", "awareness", or "soul" a new name.

But you're confusing simulations and models again.

The artificial heart is a functional model, not a computer simulation, which would be useless.

A computer program cannot, by itself, produce Sofia because Sofia is generated by a bodily function, and the outcome of a calculation cannot be a real object, emission of energy, or event.

The problem with that thought experiment is that it isn't really about a simulation. It's about a functional model made out of computer parts.

(And yes, the simulation is different, because the simulation can't actually send brain waves across a 4-D brain.)

If that functional model can do everything the actual brain does, then it can do everything the brain does. In theory, such an artificial machine-brain could certainly exist.

But computation alone cannot create a functional model brain. There's no reason to say that computers couldn't be part of a model brain, but consciousness itself -- an actual Sofia event -- cannot be the solution to any calculation or set of calculations, for the same reason that a seed, or a jolt of electricity, or a kick in the gut, or sweating cannot be the solution to a calculation.

In other words, you couldn't substitute a computer-sim brain for a real brain (for the same reason you couldn't substitute a computer-sim heart for a real heart) because the computer-sim brain, just like the computer-sim power plant, "only outputs information about how a" brain does what it does.

To run your thought experiment, you'd have to build a model. If that model brain used a CPU, it would also need some hardware to actually do what the brain does (or an equivalent) when it cranks up Sofia.
 
And that objection is missing the point anyway. What is important is the part of the computer that is controlling the simulated power plant: the part that is controlling the electricity, routing the power, or doing whatever a computer system might do in a power plant. If this computer were hooked up to a real power plant, it would be doing the same things.

A computer running a simulation of a human is not only simulating the particles, it is controlling them. A computer that is just simulating a lump of matter is different from one controlling a perfectly simulated human body in such a way that we would call it a perfect simulation of a human.

No, there is absolutely no difference.
 
Basically, yeah, that's what it is. Piggy is arguing for dualism. That's why every argument he comes up with misses the point.

Good luck pointing out any dualism in my view of the world, or my arguments.

What it boils down to is that the computationalists either don't (won't) understand that consciousness is a bodily function -- and what could it be if it is not? -- or else that computation cannot result in a real-world event, only the solution to the computation.
 
The onus of proof is on the people claiming that in this one, special, particular case the simulation is the same as the thing simulated.

Precisely. And it seems that the Church-Turing myth was the only evidence they had.
 
To run your thought experiment, you'd have to build a model. If that model brain used a CPU, it would also need some hardware to actually do what the brain does (or an equivalent) when it cranks up Sofia.
No argument there. The computer would need the hardware necessary to receive nerve signals from the body before it can use them in the brain simulation, and it would also need the hardware to send the nerve signals the simulated brain produces to the body. If this can be achieved (and there is no reason to suspect the physical connection between nervous tissue and computer is going to be the hard part in this experiment), this hardware, the computer, and the simulated brain will together form a functional model of the brain.

It would not be dissimilar to an artificial heart; while it may not physically work in exactly the same way, since its input and output are the same and it runs the same logic, it will do everything the brain does. Including producing "Sofia events", whatever they are.

Neither a disconnected brain nor a disconnected simulated brain can produce those "Sofia events" or "consciousness" or whatever you want to call it. They require a body. That just means that they aren't localised phenomena in the brain, but rather the result of the interconnectedness of the brain and body. If they were localised in the brain, a disconnected brain or a disconnected simulation of a brain would produce them as well.
 
No argument there. The computer would need the hardware necessary to receive nerve signals from the body before it can use them in the brain simulation, and it would also need the hardware to send the nerve signals the simulated brain produces to the body. If this can be achieved (and there is no reason to suspect the physical connection between nervous tissue and computer is going to be the hard part in this experiment), this hardware, the computer, and the simulated brain will together form a functional model of the brain.

No, that wouldn't work, because as Westprog has pointed out, the programming for a computer running a simulation of a spray-painting machine is not the same as the programming for a computer running a spray-painting machine.

It's extremely important to understand the truth of this point.

The functional model of the brain must do (or perform some functional equivalent of) everything the actual brain is doing in 4-D spacetime which impacts the generation and maintenance of consciousness, including coordinating those brain waves at the same time that it's coordinating "highly processed" output from various areas of the architecture.

The computer running the simulated brain is simply producing outputs to calculations which really have no bearing on the outputs that the functional model of the brain would be producing.

It would not be dissimilar to an artificial heart; while it may not physically work in exactly the same way, since its input and output are the same and it runs the same logic, it will do everything the brain does. Including producing "Sofia events", whatever they are.

No, this is simply incorrect.

A model brain would work in the same way as an artificial heart.

A simulated brain would "work" in the same way as a simulated heart.

You cannot run a real body on a simulation of a human heart.

You can run a real body on a functional model of a human heart.

You cannot run a real body on a simulation of a human brain.

You could run a real body on a functional model of a human brain.

Neither a disconnected brain nor a disconnected simulated brain can produce those "Sofia events" or "consciousness" or whatever you want to call it. They require a body. That just means that they aren't localised phenomena in the brain, but rather the result of the interconnectedness of the brain and body. If they were localised in the brain, a disconnected brain or a disconnected simulation of a brain would produce them as well.

Sofia events don't require a body, strictly speaking. You can dream with no input from the body getting through.

So a "disconnected" brain can certainly generate Sofia. In fact, all the evidence demonstrates that the brain alone creates the phenomenon.

But that has no bearing on the inability of a simulation to do what a functional model does.
 
Precisely. And it seems that the Church-Turing myth was the only evidence they had.

I'd love to see a quote from some reputable authority which supports the claim that Church-Turing proves that consciousness is a Turing property. All the references I've found explicitly deny this.

And the claim that consciousness being a physical process equates to dualism is just bizarre. It has no basis whatsoever. At least RD likes to try to track down my hidden agenda.
 
But you're confusing simulations and models again.

The artificial heart is a functional model, not a computer simulation, which would be useless.

A computer program cannot, by itself, produce Sofia because Sofia is generated by a bodily function, and the outcome of a calculation cannot be a real object, emission of energy, or event.

The problem with that thought experiment is that it isn't really about a simulation. It's about a functional model made out of computer parts.

(And yes, the simulation is different, because the simulation can't actually send brain waves across a 4-D brain.)

If that functional model can do everything the actual brain does, then it can do everything the brain does. In theory, such an artificial machine-brain could certainly exist.

But computation alone cannot create a functional model brain. There's no reason to say that computers couldn't be part of a model brain, but consciousness itself -- an actual Sofia event -- cannot be the solution to any calculation or set of calculations, for the same reason that a seed, or a jolt of electricity, or a kick in the gut, or sweating cannot be the solution to a calculation.

In other words, you couldn't substitute a computer-sim brain for a real brain (for the same reason you couldn't substitute a computer-sim heart for a real heart) because the computer-sim brain, just like the computer-sim power plant, "only outputs information about how a" brain does what it does.

To run your thought experiment, you'd have to build a model. If that model brain used a CPU, it would also need some hardware to actually do what the brain does (or an equivalent) when it cranks up Sofia.

I think that part of the confusion is due to the fact that Turing machines are held to be the model of choice when describing the operations of a computer. In fact, while a Turing machine is an excellent model of computation, it's not a good way to model a computer. Computers, in almost all modern applications, have to fulfil a role in some ways similar to the real-time control and monitoring carried out in the brain, reacting to interrupts with signals. It might be that a computer, in carrying out this function, would be conscious. However, for the pure Turing function of the computer to be able to produce consciousness would imply that the main function of the brain - monitoring and control - is irrelevant. This is implausible, to say the least - but it's what is implied by the assertion that the performance of an algorithm is sufficient.
 
The computer running the simulated brain is simply producing outputs to calculations which really have no bearing on the outputs that the functional model of the brain would be producing.
If the simulated brain produced very different outputs than a brain would, it wouldn't be a very good simulation. I think it is pretty clear that I am talking about a hypothetical very good simulation that does everything a brain does.

A model brain would work in the same way as an artificial heart.
A simulated brain would "work" in the same way as a simulated heart.
I think you are making a meaningless distinction between "model" and "simulation".

You could run a real body on a functional model of a human brain.
You haven't explained how a computer connected to a body, running a very good simulation of a human brain, cannot possibly be a functional model of a human brain.

You can dream with no input from the body getting through.
Really? I'd like to see your evidence. Has there ever been a brain that dreamt while not getting any input from the body?

But that has no bearing on the inability of a simulation to do what a functional model does.
I think it has. A good simulation does everything a functional model does, except perhaps getting input from the real world. If a disconnected brain, or even a disconnected physical model of the brain, produces some phenomenon, then a virtual model that is as detailed as the physical model must be able to do the same thing.
 
I don't know if that question is strictly answerable at this point, but this is how I see it.

If there's an overlap with another species' physical apparatus, then we will obviously know to some extent "what it's like" to experience the world the way that animal does.

Where there are differences, we probably will not know until/unless we are able to find ways to wire up our brains to simulate such an experience.

So let's say, for instance, that we confirm that birds are conscious. Well, we know they're sensing magnetic fields in a way we don't. And this sensation would surely "feel real" to them in some sort of way.

But we'd have no way of guessing what their conscious experience of it would be like.

It's like meeting an alien species that didn't use chemicals to produce the sense we call "smell": I don't see how you could ever explain it to them, even if you talked their ears off (or whatever they had).

Could you wire a human brain to roughly simulate a cat's experience?

Maybe, in a way. You could start by boosting the signal from our emotional centers, cranking up the olfactory sense, tamping down or offlining many higher-level cognitive functions.

But even then, you wouldn't know what it feels like for a cat to smell the mix of scents on your back patio, because you wouldn't know how to map your heightened aromatic experience over to your gonzo emotions.

And it may prove impossible to make a human brain even produce the mix of emotional stimulants that cats are using, much less coordinate it with the triggers that have evolved in their brains over millennia, much less weave that in with everything else that's contributing to "what it's like to be a cat".

At the end of the day, we may have to accept that the actual experience of being anything else but human simply lies beyond our knowledge, like whatever's beyond our light cone, or the precise position and momentum of a particle.

ETA: If you wired up your brain so that it was a cat's brain, you wouldn't be you anymore; in other words, it would be a cat's brain made from pieces of what used to be your brain. But if you try to wire a simulated cat's brain so that it feeds into your brain, so much will be lost in translation that you still won't be able to experience what it's like to be a cat, because you have no choice but to experience the world as a human does.

I pretty much see it the same way. But this seems to mean that we cannot describe conscious experience in terms of physics. Yet for something that cannot be described in terms of physics to have such a seemingly direct cause/effect relationship with a physical system seems bizarre. If it is only an effect, but not a cause, then it seems unlikely to have evolved, since it would offer no evolutionary advantage.
 
Dualism fail.

You're going to have to do better than that.

There is no dualism in saying that bodily functions are carried out physically.

Nor is there any dualism in noting that the outcome of a calculation cannot be a real event.

If you have something to say, you're going to have to say it so it makes some sense.
 
I pretty much see it the same way. But this seems to mean that we cannot describe conscious experience in terms of physics. Yet for something that cannot be described in terms of physics to have such a seemingly direct cause/effect relationship with a physical system seems bizarre. If it is only an effect, but not a cause, then it seems unlikely to have evolved, since it would offer no evolutionary advantage.

Indeed. The problem is that for physics as we currently understand it 'something' appears to be missing. Admitting that (for some of us) seemingly opens the door to 'magic' and/or 'dualism', since as yet the answer isn't apparent.
 
If the simulated brain produced very different outputs than a brain would, it wouldn't be a very good simulation. I think it is be pretty clear that I am talking about a hypothetical very good simulation that does everything a brain does.

It is critical to understand that the level of detail of the simulation doesn't matter.

To explain this, let's consider the simulated power plant first.

The programming for the sim-plant has to tell the computer what to do in order to produce stuff like pixels on a monitor that look like gauges or readouts or even machine parts, interfaces that let users tweak the specs, figures such as the total gigawatt output per unit of time, etc.

The programming for the real plant might produce reports including that last item there, and it will have user interfaces and displays but they will be of very different sorts. And it will have gobs of programming for actually running the physical plant, which the sim programming will lack.

On the other hand, the sim will have programming that the on-site computers lack, such as information about qualities of the walls.

You can get as detailed as you want, and the programming for the sim will never be able to run the actual power plant.

And for our purposes here, a human body or an organ in the human body is no different from a power plant or a race car or anything else.

It doesn't matter how accurate a simulation of the brain is, it still can't run a human body, because no sim can do that; only a functional model can. It's exactly the same as a computer simulation of a leg or a transistor -- you can't substitute it for the real thing.
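To make the simulation/model distinction concrete, here's a toy sketch (every name and number in it is invented for illustration): the simulation only returns information about power, while a functional model has to actually change the state of something.

```python
# Invented illustration of the simulation-vs-functional-model distinction.
# Nothing here resembles real power-plant software.

def simulate_plant(fuel_rate: float, hours: float) -> dict:
    """A simulation only *reports* what a plant would do: it returns
    information (here, an estimated gigawatt-hour figure) and affects
    nothing outside itself."""
    assumed_efficiency = 0.4  # made-up constant, for illustration only
    return {"estimated_gwh": fuel_rate * assumed_efficiency * hours}

class FunctionalPlant:
    """A functional model must actually produce the effect. Here the
    'effect' is stood in for by mutating real state (a battery level)
    rather than merely returning a description of it."""
    def __init__(self):
        self.battery_gwh = 0.0  # stands in for actually delivered power

    def run(self, fuel_rate: float, hours: float) -> None:
        self.battery_gwh += fuel_rate * 0.4 * hours

report = simulate_plant(fuel_rate=10.0, hours=2.0)
plant = FunctionalPlant()
plant.run(fuel_rate=10.0, hours=2.0)
# The simulation yields a number *about* power; the model changes state.
print(report["estimated_gwh"], plant.battery_gwh)  # 8.0 8.0
```

The two compute the same arithmetic; the difference argued for above is that only the second kind of thing could be plugged into a real grid (or, by analogy, a real body).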
 
I think it has. A good simulation does everything a functional model does, except perhaps getting input from the real world. If a disconnected brain, or even a disconnected physical model of the brain, produces some phenomenon, then a virtual model that is as detailed as the physical model must be able to do the same thing.

There's a distinction between a program which can simulate a system, and a control program which actually operates it. The two programs might have a vast amount of code in common, but their fundamental operation is different.

All that matters for the simulation is that it produces the same eventual output as the actual system. A typical simulation is synchronous, deterministic, and doesn't interact with its environment - IOW, Turing compatible.

It's certainly possible to have a simulation which is asynchronous, non-deterministic, and which interacts with some kind of environment. Such a simulation program could be used as a direct replacement for a control program. However, such a program could not be a pure Turing program. Indeed, such programs wouldn't normally be considered simulations - they'd be considered control programs embedded in a test environment. (However, as many programmers know to their cost, a test environment doesn't necessarily duplicate the real thing!)

The important point is that contrary to what has been alleged, it is not possible for a Turing program to carry out the function of the brain - and the persistent assertion that it is seems like a total blind alley in the search for a physical theory of consciousness.
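A toy contrast of the two kinds of program described above (all names invented for the sketch): a "pure Turing" batch simulation takes its whole input up front and is fully determined by it, while a control program reacts to events arriving from an environment and acts back on it.

```python
# Invented illustration of batch simulation vs embedded control program.
from queue import Queue, Empty

def batch_simulation(inputs: list[int]) -> list[int]:
    """Synchronous and deterministic: the entire input is fixed in
    advance, and the output depends on nothing but that input."""
    return [x * 2 for x in inputs]

def control_loop(events: Queue, actuator: list[int],
                 max_idle: int = 3) -> None:
    """Asynchronous: reacts to events as they arrive and acts on the
    environment (here, by appending to `actuator`). Its behaviour
    depends on timing, not just on a predefined input tape."""
    idle = 0
    while idle < max_idle:
        try:
            event = events.get(timeout=0.01)
            actuator.append(event * 2)
            idle = 0
        except Empty:
            idle += 1  # give up after a few quiet polling intervals

events = Queue()
for e in (1, 2, 3):
    events.put(e)
outputs: list[int] = []
control_loop(events, outputs)
print(batch_simulation([1, 2, 3]), outputs)  # [2, 4, 6] [2, 4, 6]
```

With the environment frozen into a fixed queue the two happen to produce the same values, which is exactly why the distinction is easy to miss; the control loop's structure (timeouts, interleaving with an outside world) is what the batch function lacks.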
 
You haven't explained how a computer connected to a body, running a very good simulation of a human brain, cannot possibly be a functional model of a human brain.

It's pretty simple, actually.

Digital simulations of physical systems are run on computers that only do things that computers do. Whether the computer is running a sim of a traffic pattern, or an object being crushed, or weather systems, or a frog's leg muscle, or anything else, it continues to do what computers are designed to do, and nothing more.

The simulations they produce can't be used to do the things that the simulated objects actually do in the real world. You can't walk on a digital simulation of a leg. Instead, you have to build a 4-D-spacetime functional model to use.

That's because simulations (in the way we've been using the term on this thread, in contrast to models) are abstractions, and do not affect the workings of the machines that support them, whereas a model actually does all of the necessary and sufficient actions to serve as whatever it's modeling.

No simulation can serve as a functional model, because the bits and parts of the machine supporting the simulation simply aren't doing what the bits and parts of the model have to do.
 
