
Has consciousness been fully explained?

A Turing-equivalent machine can't necessarily replicate the real-world rate of change. The simulation won't necessarily change at the same speed as the real system.

I'm not sure I see the relevance, but if this is an important point for you, why couldn't it run at the same speed? I don't see the obstacle.



I don't get your point, though. You say consciousness is not a physical interaction. Then you say it's produced by the physical interaction of neurons in the brain, so duplicating that physical interaction should produce consciousness. So how does it help your point to say consciousness is not a physical interaction?

Possibly answered at the end.

Double standard. It's a bare assertion for me to say software cannot replicate biological components, but it's not a bare assertion for you to say that software can replicate biological components?

http://dictionary.reference.com/browse/replicate
1. ( also intr ) to make or be a copy of; reproduce

If I can distinguish between real biological components and software simulations of them, then how is it "replicating" them?

I think we're into semantics. Here's an analogy to explain my point. If you put a piece of paper in the copier, I think it would be fair to say the contents of that paper are replicated to a more or less perfect degree depending on the copier. However, if the copier used a different kind of paper than the original, you could still distinguish the original from the copy when they're next to each other, even though the actual words/information are identical.

I'm not saying that because copiers copy, simulations can be conscious. I'm just trying to get to a point where we at least understand each other.

How are the physical interactions in a software simulation the same as the physical interactions in the brain? If you're talking about unimplemented software, then there isn't really any physical interaction. If you're talking about, say, a computer implementation of the software, is the physical interaction of electrons moving through gates the same as the physical interaction of neurons firing in the brain?



You're saying simulated neurons aren't physical? If so, how are they simulated?

If you want to grant me that simulated neurons are physical for the necessary purposes, that's wonderful. I'd just gone a few rounds with Piggy, whose viewpoint has a sharp divide between the simulated world and what we usually refer to as the 'real' or physical world, and I thought you were coming from the same direction.
 
Which is fine as far as I am concerned, but there has been considerable acrimony in this thread over the use of particular words and ideas.

Talking about the simulation is just an easier way of discussing it, I think. It is a higher level abstraction, more on our normal level of thinking.

I agree; and agreeing on the meaning and use of particular words and ideas is also important. Although the main problem I notice on this thread is that some members seem unable to grasp the concept of such a simulation, as if they have some conceptual blind spot about how it would work, and the meaning of 'reality' within such a simulation.
 
... some conceptual blind spot about how it would work, and the meaning of 'reality' within such a simulation.
Like, you mean, there isn't any? That blind spot?

Some appear to think it resides in electrons moving through gates.

Others may think they're viewing it as phosphors on a CRT. Or as a hologram, maybe?

Where do you find the reality therein?
 
Like, you mean, there isn't any? That blind spot?
No, I mean that some appear not to be able to conceive of the reality that might exist in the simulation.

Some appear to think it resides in electrons moving through gates.

Others may think they're viewing it as phosphors on a CRT. Or as a hologram, maybe?
Possibly - that's for them to say. Your point?

Where do you find the reality therein?
It seems to me that within such a simulation there can be a perception of reality that is to the simulants as our perception of reality is to us.

YMMV.
 
So consciousness would arise when a particular combination of switches is utilized?

That would create a scenario where enough monkeys flipping switches like these would create consciousness:
[qimg]http://www.dicts.info/img/ud/switch.jpg[/qimg]

Doesn't that seem kind of odd to you?

Consciousness requires, as best we understand it, a much higher level of organization than randomly flipping switches can provide.
 
OK, I see what you mean. I think it's just a difference in viewpoint. I prefer to see it as the simulation (the result of the execution of the software by the hardware) that's conscious, rather than the hardware or the software. Maybe it's because it all reminds me uncomfortably of Searle's Chinese Room...

I suppose I can see that. Does it also make you uncomfortable to say the brain is conscious? That's the same thing. I think it probably does make more sense to say a given system in action is conscious.
 
Not so. Simulations are done precisely because they duplicate the mathematical relationships in real objects. An accurate simulation of the brain will duplicate all important mathematical relationships within the brain.

You say "not so", but you aren't actually contradicting what I said. Duplicating mathematical relationships, which I'm not even sure is an accurate description of a simulation, is still not the same as duplicating real objects.

Since, as best we understand reality, all things can be described with math (leptons and bosons compose everything, and they and their interactions are described wonderfully with math), there's no reason to think the simulation would miss out on anything essential to modeling the behavior of the brain.

Just like a simulation of water and its interactions will accurately depict how it responds to stimuli, a simulation of the brain would accurately depict how it responds to stimuli. This would include all important features such as learning, emotions, consciousness, etc. Granted, the simulation would have to supply stimuli to the brain and properly model the input and output to it from nerves, but this can be done easily enough by simulating a complete environment or simply attaching sensors to the outside world.

Even if that's the case, we'd have the difference between a perfect description/depiction of a thing and a thing itself.

Granted, the gross physical characteristics of the brain won't exist except in the simulation. So this brain wouldn't be made of organic matter. That stuff would only exist within the simulation. That's irrelevant, unless you are proposing there is something special within the actual matter that can't be captured mathematically. Again, this simulation would duplicate a real brain's response to information coming in on nerves, it would model the growth of new dendrites and new neural cells, and if you connected it to the real world via sensors and other interfaces that translate between the simulated world and the real one, it could interact just like any human.

No, I'm saying representing the mathematical relationships of something isn't the same as duplicating it. To interact like a human, the machine needs a body. We can't just simulate its legs. We can't just simulate its eyes. We can't just simulate its nerves and muscles. But we can just simulate its brain?

Now, you might try to argue that the interface somehow contains consciousness, but this is like saying consciousness exists in our eyes or in the nerves on the skin. It's completely disproved by everything we know about the brain and how consciousness works. Also, you'd be insulting my mother,* so I'd take offense.

Not sure what you mean.

So the real question is, do you think consciousness is a result of neurons and other parts of the brain interacting with each other, or do you think it comes from somewhere else? If you think the former, then the simulation will perfectly capture that interaction and will be conscious (well, as much as a human in a similar state would be).

I think the former. But no, the simulation will not perfectly replicate the physical interactions of the brain. We can distinguish between the two. Representing something is not the same as recreating it.

If you think the latter, then you are either arguing for dualism or claiming there is something about matter that can't be modeled with mathematics. That said, we have no evidence at all that anything in reality can't be modeled with math, and in fact, as best we understand it, everything can.

We can't model things just with math if model means "recreate".
 
Consciousness requires, as best we understand it, a much higher level of organization than randomly flipping switches can provide.

Would you say it requires more organization than people writing 1's and 0's on paper? Could consciousness arise from that?
 
False analogy.

A better analogy would be pulse rate. Your simulated heart would create a simulated pulse. However, the simulated pulse would have a rate. This pulse rate is the abstraction of what the heart is doing. This informational pattern will inevitably arise every time you perfectly simulate a heart.
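
To make that concrete, here's a minimal, purely illustrative sketch (the beat model and the numbers are invented for the example, not a claim about real hearts): the simulation only steps a heart model forward, yet a pulse rate can be read off the simulated beats even though no "rate" is coded anywhere in the model.

[code]
# Toy heart simulation: step the model forward in small time increments and
# record a beat whenever a contraction cycle completes. The "pulse rate" is
# never stored in the model - it is an abstraction computed from what the
# simulation does.

def simulate_heart(sim_seconds, beat_period=0.8, dt=0.01):
    """Return the simulated times (in simulated seconds) at which beats occur."""
    beats, phase, t = [], 0.0, 0.0
    while t < sim_seconds:
        phase += dt
        if phase >= beat_period:   # one full contraction cycle completed
            beats.append(t)
            phase -= beat_period
        t += dt
    return beats

beats = simulate_heart(60.0)
print(len(beats))                  # ~75 beats per simulated minute
[/code]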

All analogies break down, but yours was a false analogy from the get-go.

You think consciousness is something like a measurement, not something like a bodily function?

That's quite bizarre.

When I'm lying awake at night and can't get to sleep, it's because my brain won't stop something that it's doing. It won't cease a particular behavior. That's not an abstraction, it's a bodily function.

The analogy with the pulse holds.

Consciousness is not like a pulse rate, it's like a pulse.
 
Then I think we are agreed.

I think that, rather than computationalists actually believing it occurs through programming alone, this is probably a case of poor language use. I know I fall into that trap all the time. But that is just conjecture on my part.

I wish it were true, but whenever they have an opportunity to back off their contention, they don't. It seems very clear at this point that they really mean it.
 
I'm not sure I see the relevance, but if this is an important point for you, why couldn't it run at the same speed? I don't see the obstacle.

The relevance is that you said pulse rate would be a better analogy, and a rate is a function of speed. A Turing machine is a serial processor, and its speed depends on its implementation, so it could be anything. So a Turing machine simulation of a heart would not necessarily have the same real-time pulse rate as a real heart. That's all I was saying.
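
Here's a small illustrative sketch of that point (the numbers and the pacing scheme are just assumptions for the example): the same simulation covers the same amount of simulated time either way, but its wall-clock speed depends entirely on the hardware unless you deliberately pace it against the real clock.

[code]
import time

def run(steps, dt=0.01, real_time=False):
    """Advance a simulation by `steps` increments of simulated size dt.
    Simulated elapsed time is always steps * dt; wall-clock time is whatever
    the hardware delivers unless each step is deliberately throttled."""
    start = time.time()
    for _ in range(steps):
        pass                       # stand-in for one update of the model
        if real_time:
            time.sleep(dt)         # pace the step against the real clock
    return steps * dt, time.time() - start

print(run(6000))                   # 60 simulated seconds, milliseconds of wall-clock time
print(run(6000, real_time=True))   # 60 simulated seconds, ~60 seconds of wall-clock time
[/code]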

I think we're into semantics. Here's an analogy to explain my point. If you put a piece of paper in the copier, I think it would be fair to say the contents of that paper are replicated to a more or less perfect degree depending on the copier. However, if the copier used a different kind of paper than the original, you could still distinguish the original from the copy when they're next to each other, even though the actual words/information are identical.

Not sure that's a good analogy. It's too hard to relate it to brains and simulated brains. Anyway, the sense in which the copy's information is "identical" is observer dependent. The copy machine could also print out a string of 1s and 0s that represented the mathematical relationships of the symbols comprising the words, but the copy would no longer be functionally equivalent and we wouldn't call it a duplicate. From the perspective of someone who needs to read the paper, the first example is a functionally equivalent model and the second one is not.
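
A tiny illustration of that observer-dependence (the encoding choice here is arbitrary, just for the example): both outputs below carry the same information about the same sentence, but only one is functionally equivalent for someone who needs to read it.

[code]
# The same sentence printed twice: once as readable text, once as the binary
# codes of its characters. Identical information, very different usefulness
# to a reader.
text = "Has consciousness been fully explained?"
bits = ' '.join(format(ord(c), '08b') for c in text)
print(text)
print(bits)
[/code]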

If you want to grant me that simulated neurons are physical for the necessary purposes, that's wonderful. I'd just gone a few rounds with Piggy, whose viewpoint has a sharp divide between the simulated world and what we usually refer to as the 'real' or physical world, and I thought you were coming from the same direction.

Simulations of neurons are physical. A simulation of a neuron involves some physical process, such as manipulating symbols on tape or electrons moving through gates in a computer. Physically, a simulation of a neuron is very different from a neuron. There is no 'simulated world'.

My argument is, if you say that consciousness is a result of physical interaction, then it matters that the physical interaction in a brain is very different from the physical interaction in a simulator.
 
You think consciousness is something like a measurement, not something like a bodily function?

That's quite bizarre.

When I'm lying awake at night and can't get to sleep, it's because my brain won't stop something that it's doing. It won't cease a particular behavior. That's not an abstraction, it's a bodily function.

The analogy with the pulse holds.

Consciousness is not like a pulse rate, it's like a pulse.

This is just a lengthy "is not" so I'll just say "is too" and end my participation here.
 
You keep thinking there has to be a physical component for consciousness, but you can't argue why; you have to keep resorting to analogies, and so can never make headway in your argument. So, just for a refreshing change of pace, here is a completely analogy-free argument.

Consciousness is not a physical object or a physical action. While definitions for consciousness vary, it's commonly referred to as a certain kind of state. Specifically, one thing that crops up in a lot of definitions is 'awareness.'

We know that the brain can create such a state of awareness through the interactions of various biological components, primarily neurons.

So if we replicate those various biological components in a software simulation, we would expect to create the same state. The only difference is that the state would be formed within software rather than wetware, and wetware is not a requirement of the definition of consciousness.

There is no behavior without a physical cause.

Would you care to cite an example of any behavior without a physical cause? Or care to explain how it's even possible?

I assume no, since you have never done so.

We observe that the brain sometimes does consciousness, sometimes does not, and that it uses resources when it does which it is not using when it does not. Which means it's a bodily function with a direct physical cause.

This is all quite straightforward.

And it's not an analogy.

If we replicate these bodily functions using a functional model, then the machine we create will also be conscious.

If we run a simulation of a brain (or of our model) then the machine running the sim will not somehow magically become conscious itself, for the same reason that a computer running a sim of a power plant will not begin magically generating power.

No analogy there.

Consciousness is indeed an action, something the body does some of the time.
 
Simulations of neurons are physical. A simulation of a neuron involves some physical process, such as manipulating symbols on tape or electrons moving through gates in a computer. Physically, a simulation of a neuron is very different from a neuron. There is no 'simulated world'.

My argument is, if you say that consciousness is a result of physical interaction, then it matters that the physical interaction in a brain is very different from the physical interaction in a simulator.

Alright, I'll bite. In what way does it matter that the physical interaction in a brain is very different from the physical interaction in a simulator?

My argument is that if the software neurons and wetware neurons (and all the other components) have the same behavior relative to the other components within their respective systems, then the same informational processing pattern should appear, and thus consciousness.
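
As a purely hypothetical sketch of "same behavior relative to the other components" (this uses a standard textbook leaky integrate-and-fire simplification with made-up constants, not a claim about real neurons): each simulated neuron only has to respond to its inputs the way its counterpart does for the downstream components to see the same pattern.

[code]
# Leaky integrate-and-fire neuron: a standard simplification, used here only
# to illustrate components interacting through their behavior.

class Neuron:
    def __init__(self, tau=10.0, threshold=1.0):
        self.v = 0.0               # membrane potential (arbitrary units)
        self.tau = tau             # leak time constant
        self.threshold = threshold

    def step(self, input_current, dt=1.0):
        """Integrate input for one time step; return True if the neuron fires."""
        self.v += dt * (-self.v / self.tau + input_current)
        if self.v >= self.threshold:
            self.v = 0.0           # reset after a spike
            return True
        return False

# Two neurons in series: the second responds only to the spikes of the first,
# i.e. to its behavior, not to what it is physically made of.
a, b = Neuron(), Neuron()
for t in range(100):
    a_spiked = a.step(input_current=0.15)
    b.step(input_current=0.5 if a_spiked else 0.0)
[/code]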
 
This is just a lengthy "is not" so I'll just say "is too" and end my participation here.

Or you could attempt to explain how we get behavior without a direct physical cause, I suppose.

You could attempt to explain why the body uses measurable resources to maintain consciousness if it's not a bodily function.

Just a thought.
 
We observe that the brain sometimes does consciousness, sometimes does not, and that it uses resources when it does which it is not using when it does not. Which means it's a bodily function with a direct physical cause.

This is all quite straightforward.

And it's not an analogy.
If we replicate these bodily functions using a functional model, then the machine we create will also be conscious.
If we run a simulation of a brain (or of our model) then the machine running the sim will not somehow magically become conscious itself, for the same reason that a computer running a sim of a power plant will not begin magically generating power.

No analogy there.

Consciousness is indeed an action, something the body does some of the time.

I snipped your word games about 'physical'.

The highlighted portions are either a misunderstanding or a contradiction.

How do you imagine your machine gaining consciousness?
 
Perfect simulation, remember?

Doesn't matter, because the perfect simulation doesn't cause the physical apparatus running the sim to begin behaving like the simulated system.

A perfect simulation of a power plant won't cause the machine running it to generate power.
 
Or you could attempt to explain how we get behavior without a direct physical cause, I suppose.

You could attempt to explain why the body uses measurable resources to maintain consciousness if it's not a bodily function.

Just a thought.

Well ok. First you light the strawman on fire and watch it burn. Then you do something useful. And don't give me the 'logical extension' ******** when I've already explained why it isn't.
 
Not so. Simulations are done precisely because they duplicate the mathematical relationships in real objects. An accurate simulation of the brain will duplicate all important mathematical relationships within the brain. Since, as best we understand reality, all things can be described with math (leptons and bosons compose everything, and they and their interactions are described wonderfully with math), there's no reason to think the simulation would miss out on anything essential to modeling the behavior of the brain.

If a computer runs an accurate simulation of the heat death of the universe, will it cause the computer to stop working?
 
