
Has consciousness been fully explained?

No, it is more like I play something on a piano and say it is music. You ask what music is, I say it is a series of sounds played over time.
How do you distinguish music from any random sequence of sounds over time?

That's not needed for the purposes of this conversation, any more than you need a generalized definition of language to talk about English (or to learn it).
You need a generalized definition of language to communicate effectively about language.

"pattern". That is how it is generalizable.

Yes, pattern is a reasonable term to define information with. But then you have to define what you mean by pattern. In Shannon's information theory, IIRC, the more regular the pattern, the less information it holds.
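
To make that Shannon point concrete, here is a minimal sketch of my own (the function name and example strings are just invented for this post) that estimates entropy from symbol frequencies. A perfectly regular string carries essentially no information per symbol, while a less predictable one carries more:

```python
import math
import random
from collections import Counter

def entropy_bits_per_symbol(seq):
    # Estimate Shannon entropy from symbol frequencies: H = -sum(p * log2(p))
    counts = Counter(seq)
    n = len(seq)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

regular = "A" * 32                                          # maximally regular pattern
random.seed(0)
noisy = "".join(random.choice("ABCD") for _ in range(32))   # far less regular

print(entropy_bits_per_symbol(regular))  # 0.0 bits/symbol: no surprise, no information
print(entropy_bits_per_symbol(noisy))    # close to 2 bits/symbol
```

This crude per-symbol estimate ignores sequential structure, but a fuller treatment (compressibility, for instance) shows the same contrast: the more regular the pattern, the less information per symbol.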
 
How do you distinguish music from any random sequence of sounds over time?

That is, of course, an excellent question. But I think "what is art?" is rather outside the scope of this discussion. Hypothetically, one could point at any collection of sounds and call it music, I believe.

You need a generalized definition of language to communicate effectively about language.

Not necessarily at all. For instance, you do not need to know all potential sounds that exist, or even all potential sounds humans can make with their mouths, tongues, and vocal cords, to discuss that aspect of the English language. You also don't need a generalized understanding of syntax to understand English syntax in particular.

That's not to say there isn't merit in a broader discussion of language in general.

Yes, pattern is a reasonable term to define information with. But then you have to define what you mean by pattern. In Shannon's information theory, IIRC, the more regular the pattern, the less information it holds.

And I did discuss what I meant by pattern in sufficiently specific terms. I talked about how signals from nerves indicate intensity and so forth, for instance, and how different patterns within the brain cause the affected neurons to respond differently. How is that not sufficient?
 
Can a simulated world make a real brain become conscious?

Huh? The brain makes the brain conscious. The external world does not make the brain conscious.

If we replaced all of your sensory neurons with artificial ones, hooked up to a simulation, would your real brain still be conscious even though all your sensory input came from a simulation?

My brain is my brain, regardless.
 
If one replaced all your neurons with computers that behaved like neurons because they ran a "simulation" of a neuron, would that be a model brain or a simulated one?

If you build any physical device which accomplishes the same physical tasks as an organic brain, then you have a model brain.

Just like if you build a physical device which accomplishes the same physical tasks as a racecar, you have a model racecar.

Digital simulations of brains and racecars, however, do not make the machines running these sims behave like brains or racecars.

I'm getting really tired of repeating this same obvious fact over and over again.
 
If you then just replace the connections between the neurons with simulations as well, does the epistemology change?

Whoa, hoss.

Now you're getting sloppy again and conflating models and simulations.

Your statement here makes no sense.

You can't replace neurons with simulated neurons, only physical equivalents.

It's like saying "What if I replaced your thyroid with a digital simulation of a thyroid?" It's bunk.

I can only replace your thyroid with a functional model of a thyroid.
 
Actually, we don't. We can use a computer to perform a calculation for us, but it only constitutes addition when we interpret it as such. The operations performed by a computer aren't inherently addition. We decide how to interpret the results.

Yeah, I caught my own error after I made the post.

Sorry about that, chief.
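
For what it's worth, that interpretation point can be made concrete. Here's a minimal sketch of my own (the function name and bit lists are invented for illustration): the logic only shuffles bits through XOR/AND/OR, and reading the result as "addition" is something we bring to it by treating the bit patterns as binary numbers.

```python
def combine_bits(a_bits, b_bits):
    # Ripple through two equal-length bit lists with XOR/AND/OR "gates".
    # Nothing here "knows" about numbers; it just yields another bit pattern.
    out, carry = [], 0
    for a, b in zip(reversed(a_bits), reversed(b_bits)):
        out.append(a ^ b ^ carry)
        carry = (a & b) | (a & carry) | (b & carry)
    out.append(carry)
    return list(reversed(out))

print(combine_bits([0, 1, 1], [0, 0, 1]))  # [0, 1, 0, 0]
# Only when we read those lists as binary integers does this become "3 + 1 = 4".
```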
 
Digital simulations of brains and racecars, however, do not make the machines running these sims behave like brains or racecars.

I'll give you the hypothetical that no one seems to want to deal with.

You have a machine that simulates the world, all the people on it, the solar system, etc. There's a simulated you, doing everything you'd do in that simulated world. As far as any simulated person in that world can tell, everything is real (since everything there is equally simulated).

On what basis is the simulated person not conscious? Further, since you could be such a simulation, how can you say the simulated you in this scenario is not conscious, but you are?
 

Because you're trying to get two for one.

You say you only have enough hardware to accomplish A, but that the result is A and B.

It's like saying that I can make my computer play a CD by programming alone, with no more hardware than is necessary to support the software -- nothing to spin the disc, no laser, none of that.
 
Minor disagreement here. The physical behavior of the apparatus is different when it is running a sim compared with when it is not running the sim. As Wasp put it, its behavioral difference can be described in terms of electrons moving through gates.

Of course, I agree that there is no reason to believe that the machine will take on the properties of what it is simulating as a result of this behavior. Like, it won't become wet when simulating water or become conscious when simulating a brain.

But it's not significantly different.

It's like saying that if I'm doing a dance, and I change the steps, somehow I'll be doing a dance and also flying to the moon.

To fly to the moon, I have to do what's necessary to fly to the moon, which is different from what I have to do in order to dance. Changing the dance steps doesn't somehow make flying to the moon possible.
 
Why not? Programming helps define what is and is not information in a computer system, but there is always physical action being carried out.

And yet this physical action does not correlate to what we imagine the simulation to be simulating.

It all comes back to this: Simulating an aquarium doesn't make the computer wet, and doesn't make the computer swim.

Changing the simulation to a sim of a brain doesn't make the computer conscious.
 
C'mon Piggy, we needn't resort to that sort of response. If that is where we are headed I think we both better sleep it off.

No one that I know of claims to know how brains do it or how a computer might do it. The argument is only that since brains can do it, we should somehow be able to get a computer to do it with enough knowledge. I don't see any compelling argument that a computer couldn't.

No, the proper argument is that if brains can do it, then machines can do it.

Such a machine may well incorporate a computer.
 
What makes the cypher a cypher is the interpretation of a conscious mind. Otherwise it's just electrons.

Indeed. There seems to be a fundamental entification problem w/ those who are too accustomed to thinking of "information" in objective terms.
 
You could certainly replace a brain with a computer in theory. Connect the nerves up to a system that translates the impulses into something the computer can process and respond to, and the body can stay alive, assuming the computer is programmed correctly. With better programming, such a cyborg could listen, talk, walk, etc.

Really?

How does that work, exactly?
 
I think you guys are using the word 'simulation' in entirely different ways and talking past one another.

If only.

The hardcore computationalists are very explicit in their claims that simulations can make the machines running them conscious.
 
Pixy is talking about simulating the whole brain, including its input, on a Turing machine.

So what?

The robustness or granularity of the sim makes no difference.

It's like saying that a sufficiently detailed drawing of a bird will become able to fly, mate, nest, lay eggs, and migrate.
 