
Has consciousness been fully explained?

There's a school of thought that maintains that consciousness consists entirely of its external behaviour. This seems to me to be absurd. The very concept of consciousness is of something that is experienced; external behaviour is secondary to that. If we design something merely to imitate the external behaviour, we might also duplicate the internal experience, but there's no reason to be sure that we do until we fully understand how the internal experience is produced.

We can't assume that an electric car produces the same emissions as an internal combustion engine, just because it looks the same.

And as we make advances in probing actual brains, beginning to understand what the brain is doing when it's conscious and what it can do without the participation of the parts that produce consciousness, that superficial approach goes right out the window.

As we make progress, we come closer to being able to define conscious awareness as a function of the physical brain.

When sufficient progress has been made, we should be able to determine whether other species are conscious by studying how their brains work and determining if they are performing the necessary and sufficient actions to make them conscious.

But already, this idea that we can make such determinations by observing behavior is dead in the water.

ETA: Well, not quite, actually, since it's certainly possible that we will be able to identify some behaviors which must require those brain functions. But we haven't yet.
 
Given that the focus on the machine running the sim rather than the sim itself seems to be your creation rather than mine, I'm not sure how to answer this.

My brain is running the computations and information processing that result in (or "is", depending on semantics) my consciousness. It's fed power and input from my body, but nothing in my body is generating my consciousness.

If you want to say that my body is not conscious, then that's fine. If that's how you want to use the language, I'll agree the machine running the sim is not conscious. That's how I'd use it myself.

ETA: you could also use the analogy brain -> machine, brain activity -> sim, consciousness -> consciousness.

The focus on the machine itself is necessary because the sim is an abstraction.

If you claim that you've done something that makes a machine conscious, you must mean that it makes the actual physical apparatus conscious in objective physical reality.

Talking about what is "real" within the imaginary world of the sim has absolutely nothing to say about whether any of those things are indeed real.

So if you want to talk about conscious machines, you must be talking about the machines. If you want to say that a machine running a sim becomes conscious, looking at the sim won't help you explain it -- you have to look at the machine.

I mean, bears have conversations with kangaroos in the Hundred Acre Wood, but this tells us nothing about actual animal communication.

And simulations of hurricanes can tell us a lot about actual hurricanes, but they tell us nothing about what would be required to build a machine that can blow the roofs off of houses.

ETA: No, your brain is not "running the computations and information processing that result in (or "is", depending on semantics) my consciousness". Why? Because consciousness is a behavior of your body, and as such has a direct physical cause, not any imaginary or abstract cause such as "information processing".
 
If you mean "the simulation would be conscious" in the same sense that you mean "the simulation just destroyed the planet earth" then you're fine.

All that happens in imaginary space. We look at the output and we imagine a conscious entity or a destroyed planet when none really exists.

On the other hand, if you want to claim that some real instance of conscious awareness is created, that's an entirely different proposition.

Ok, what's the difference between an imaginary conscious entity and a real conscious entity?

If you have a sim that has an entity in it that in all respects acts and behaves conscious within that imaginary space, how on earth can you say it isn't actually conscious?

This goes back to the question you never really answered about whether giving the sim entity I/O access to real machines and sensors would move it over the line to conscious in your usage.
 
The focus on the machine itself is necessary because the sim is an abstraction.

And as you conceded earlier, all thought is abstraction. So what?

If you claim that you've done something that makes a machine conscious, you must mean that it makes the actual physical apparatus conscious in objective physical reality.

Oops, there goes that sim to machine hop again.
Talking about what is "real" within the imaginary world of the sim has absolutely nothing to say about whether any of those things are indeed real.

Unless you're talking about abstract information processing which can definitely happen for real in a sim. Which we are.
 
Would you say it requires more organization than people writing 1's and 0's on paper? Could consciousness arise from that?

It depends on the rules used to write the 1s and 0s. Randomly writing down 1s and 0s? No. If the rules were sufficiently complicated, involved input from an environment of some sort, and what one person wrote down affected what other people wrote...then you could have some sort of extremely slow consciousness, I believe.

This is no different than saying you could create an intelligence by having a few billion people each act like one brain or nerve cell, and carry out what that cell would do including receiving and transmitting information to other cells, forming new connections, and receiving sensory information from the environment.
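As a rough, purely illustrative sketch of what such rules might look like (everything below is made up; no claim is made that this toy is conscious): each "person" follows a fixed rule card that takes an environment bit plus what their neighbours wrote last round, and writes a new 1 or 0.

```python
# A minimal sketch (hypothetical, illustrative only) of the "people writing
# 1s and 0s on paper" scenario. The rule here is arbitrary; the point is only
# what "sufficiently complicated rules with input and mutual influence" could
# mean in practice.

NUM_PEOPLE = 8

def rule(env_bit, left, right, own_last):
    """One person's rule card: what digit to write, given what they can see."""
    # Arbitrary rule: copy the environment bit, unless both neighbours
    # disagreed with you last round, in which case flip your last digit.
    if left != own_last and right != own_last:
        return 1 - own_last
    return env_bit

sheets = [0] * NUM_PEOPLE          # what each person wrote last round

for t in range(6):                 # six very slow "clock ticks"
    env = [1 if (t + i) % 3 == 0 else 0 for i in range(NUM_PEOPLE)]
    sheets = [
        rule(env[i],
             sheets[(i - 1) % NUM_PEOPLE],   # left neighbour's last digit
             sheets[(i + 1) % NUM_PEOPLE],   # right neighbour's last digit
             sheets[i])
        for i in range(NUM_PEOPLE)
    ]
    print(t, sheets)
```

Run fast in silicon or very slowly on paper, the same rules produce the same sequence of digits, which is the point of the analogy.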
 
You realize that argument supports exactly the similarity between abstract information processing in brains and in sims that your own argument requires to be false, right?

I understand that's how you intended it, yes.

But if you follow the evidence and logic far enough you'll see where that falls apart.
 
One might do much more than simply suggest that.

Indeed. One might even turn around and say that dismissing the recreation of information processing in a simulation by pointing out that you can't recreate something more than information processing in a simulation is rather disingenuous.
 
Indeed. One might even turn around and say that dismissing the recreation of information processing in a simulation by pointing out that you can't recreate something more than information processing in a simulation is rather disingenuous.

You do realize that information processing is an abstraction, right?

It's a way we can manage to talk about things that are too complex to talk about otherwise.
 
I understand that's how you intended it, yes.

But if you follow the evidence and logic far enough you'll see where that falls apart.

Seriously? Is this the level you really want to run in?

Nuh-uh! If you follow the evidence and logic far enough you'll see where your argument falls apart.

Can we please skip these in the future?
 
You do realize that information processing is an abstraction, right?

It's a way we can manage to talk about things that are too complex to talk about otherwise.

Our brains process information. The manner in which they do so either is or produces consciousness, depending on your word usage.

You agree that a machine can process information in the same way and thus would produce consciousness.

Simulations can also process information, but for some reason you think that because it is not a machine or a brain it cannot do so in the same way and thus produce consciousness.

Your arguments for why it cannot process information have been that it cannot speed like a racecar or generate electricity like a powerplant or become wet like water. All of which are completely true, of course, but they are equally irrelevant to whether it can or cannot process information. Which it can.
 
Ok, what's the difference between an imaginary conscious entity and a real conscious entity?

It's precisely the same difference between an imaginary power plant and a real one.

The imaginary power plant has no walls, doesn't occupy any land, produces no power, etc. It's just a collection of abstractions -- such as patterns of light or ink which make us think things like "the power plant is overloaded" or "the changes to the power plant make it more efficient" even though there's nothing to actually overload or to be more or less efficient.

The real power plant produces energy, uses resources, takes up real space.

A real conscious object takes up space and is conscious in 4-D spacetime. A simulated one merely makes us think things like "now he's going to sleep" even though we're simply looking at a machine that's built to run simulations, not to be conscious.

(It continues to amaze -- and depress -- me that this simple point has to be discussed at all, much less ad nauseam.)
 
Our brains process information.

No, they do not.

We talk about them that way, but information isn't real.

If it is real, I'd love to know how much this information in our brains weighs, or what its wavelengths are.
 
You agree that a machine can process information in the same way and thus would produce consciousness.

No, I do not.

To be conscious, a machine will have to engage in sufficient physical processes to mimic the behavior.

Same as anything else you want to mimic.
 
No, they do not.

We talk about them that way, but information isn't real.

If it is real, I'd love to know how much this information in our brains weighs, or what its wavelengths are.

Well then consciousness isn't real either, but I don't see that as a useful way to address things.
 
Seriously? Is this the level you really want to run in?

Nuh-uh! If you follow the evidence and logic far enough you'll see where your argument falls apart.

Can we please skip these in the future?

Probably we can. I just get tired of the merry-go-round sometimes.
 
You say "not so", but you aren't actually contradicting what I said. Duplicating mathematical relationships, which I'm not even sure is an accurate description of a simulation, is still not the same as duplicating real objects.

It is an accurate description of a simulation. They all attempt to duplicate the relevant relationships as much as possible.

Sure, it isn't the same as duplicating real objects. Replacing a mechanical phone line routing system with an electrical one is also not duplicating the mechanical device. That doesn't mean it doesn't do the same job.

Even if that's the case, we'd have the difference between a perfect description/depiction of a thing and a thing itself.

And that's relevant how? We don't need an exact duplicate, only one that copies the essential behavior of the original.

No, I'm saying representing the mathematical relationships of something isn't the same as duplicating it. To interact like a human, the machine needs a body. We can't just simulate its legs. We can't just simulate its eyes. We can't just simulate its nerves and muscles. But we can just simulate its brain?

Does a quadriplegic not qualify as human? Are they sub-human if they are also blind? You overstate the need for a body a bit. We all agree they'd need some sort of input from an environment, and the ability to interact with that environment. But saying it needs to be "just like a human body" is silly. You'd seem to be dismissing people like Helen Keller.

Are muscles required to be conscious? I think not, and you'd need a pretty good argument to claim otherwise.

No one is proposing that just duplicating the brain, without some sort of input and output, is the way to go. You can go two routes with input and output, though. One, you can have a simulated environment. Two, you can provide an interface to the outside world (speakers, microphone, camera, etc.). Either one would do.
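As a minimal sketch of those two routes (the class names and placeholder calls below are hypothetical, not a real device API): the simulated agent only ever talks to an abstract environment interface, and that interface can be backed either by another simulation or by an adapter to real sensors and outputs.

```python
# A minimal, illustrative sketch of the two I/O routes described above.
# Whether the interface is backed by another simulation or by real hardware
# is invisible to the simulated agent.

class Environment:
    def read_input(self):            # whatever counts as "sensory" data
        raise NotImplementedError
    def write_output(self, signal):
        raise NotImplementedError

class SimulatedWorld(Environment):
    """Route 1: the environment itself is simulated."""
    def __init__(self):
        self.time = 0
    def read_input(self):
        self.time += 1
        return f"simulated stimulus at t={self.time}"
    def write_output(self, signal):
        print("sim world receives:", signal)

class RealWorldInterface(Environment):
    """Route 2: hypothetical adapter to real devices (camera, microphone,
    speaker). The bodies below are placeholders, not a real device API."""
    def read_input(self):
        return "frame/audio captured from real hardware"   # placeholder
    def write_output(self, signal):
        print("sent to real speaker/actuator:", signal)    # placeholder

def run_agent(env: Environment, steps: int = 3):
    for _ in range(steps):
        stimulus = env.read_input()
        env.write_output(f"response to: {stimulus}")

run_agent(SimulatedWorld())
```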

Not sure what you mean.

I mean precisely what you are seeming to argue above. That saying you need legs, or eyes, or the like to be conscious is silly. You probably need some sort of environment to interact with at least for some part of your life, but demanding all 5 senses, muscles, a skeleton, and the rest is pretty ridiculous. Consciousness clearly resides in the brain, not the eyes, not the muscles, not the ears, not the arms, not the legs, not the heart.

I think the former. But no, the simulation will not perfectly replicate the physical interactions of the brain. We can distinguish between the two. Representing something is not the same as recreating it.

First, you don't need "perfect", you need "good enough." Brain damage, variances in brains, mental problems, and a host of other things are evidence enough that consciousness is pretty resilient. Heck, as far as we are aware you can't get rid of it without literally rendering someone brain dead. So if you make a model that duplicates the interactions of the brain to within 0.001%, then there's no reason to think it wouldn't be conscious (again, assuming you give it some inputs and outputs for information).

We can't model things just with math if model means "recreate".

It depends on what part of the system we want to recreate. If we want to recreate the part of the brain that takes in signals on nerves and outputs signals on nerves, and does so in a particular fashion, we are perfectly capable of doing that in theory. We can even add eyes and the like to it. Sure, you won't have "gray matter", but are you really saying that color or how a brain feels to the hand is where consciousness lies? It isn't in how it sends signals that operate the tongue, mouth, jaws, and vocal cords, producing words? It isn't in how it interprets written scribbles into sensible language? It isn't in how it takes in the touch of a loved one and feels soothed? This is ALL about taking in information and processing it, not about being sticky, or a particular color, or whatever.
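Taken literally, that claim is about the mapping from incoming nerve signals to outgoing nerve signals, plus internal state that changes over time. A toy sketch of that interface (the weights and sizes are made up; no biological fidelity is implied):

```python
# A hypothetical toy "brain region": a stateful function from input spikes to
# output spikes. The point is the shape of the interface, not the biology.

class ToyBrainRegion:
    def __init__(self, n_in, n_out):
        self.n_in, self.n_out = n_in, n_out
        self.potential = [0.0] * n_out          # internal state per output line
        # fixed, made-up weights from each input line to each output line
        self.weights = [[0.5 if (i + j) % 2 == 0 else -0.25
                         for i in range(n_in)] for j in range(n_out)]

    def step(self, input_spikes):
        """Take one tick of input spikes (0/1 per nerve), return output spikes."""
        output = []
        for j in range(self.n_out):
            self.potential[j] *= 0.9            # leak a little each tick
            self.potential[j] += sum(w * s for w, s in
                                     zip(self.weights[j], input_spikes))
            if self.potential[j] > 1.0:         # threshold reached: spike
                output.append(1)
                self.potential[j] = 0.0
            else:
                output.append(0)
        return output

region = ToyBrainRegion(n_in=4, n_out=2)
for t in range(5):
    spikes_in = [1, 0, 1, 0] if t % 2 == 0 else [0, 1, 0, 1]
    print(t, region.step(spikes_in))
```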
 
Well then consciousness isn't real either, but I don't see that as a useful way to address things.

The difference, of course, is that consciousness is a behavior. Which we can measure in terms of resource usage, for instance.

We don't imagine it's a thing.

But you seem to be treating information as a thing rather than an abstraction when you say that our brains "process information".

That's a very fundamental entification error.

If you make the superficial comparison between a computer and a brain as "information processors" and don't stop to remind yourself that neither one actually does process any "information" then you're going to come to all sorts of wrong conclusions.
 
If a computer runs an accurate simulation of the heat death of the universe, will it cause the computer to stop working?

It will cause the simulated universe to stop working.

A simulation that is an accurate representation of the big bang will cause a simulated universe to start working.

A simulation that is an accurate representation of a mechanical cipher will encode information (a toy sketch of this appears below).

A simulation that is an accurate representation of a PS1 will run PlayStation games (that are on simulated CDs, of course... or real CDs if you have it translate information from your CD drive).

A simulation that is an accurate representation of a brain with input and output will take in stimulus along simulated nerves and output stimulus along simulated nerves. Feed it an education and Shakespeare via the inputs and it will simulate a changing and learning brain, and simulate emotions and feelings regarding the Bard. It would be able to simulate the creation of a term paper and simulate a real brain's reaction to emotional events. It could simulate falling in love. How would this not be conscious?
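The cipher case is the easiest to show concretely, since the job of a cipher just is a transformation of symbols. A toy one-rotor cipher (the wiring below is arbitrary, not a model of any real historical machine) genuinely encodes text when simulated:

```python
# A small illustration of the cipher point above: a software model of a
# made-up one-rotor mechanical cipher. Unlike the simulated hurricane, the
# simulated cipher genuinely does the job of the original, because the job
# is itself a transformation of symbols.

import string

ALPHABET = string.ascii_uppercase
ROTOR = "QWERTYUIOPASDFGHJKLZXCVBNM"   # arbitrary fixed wiring (a permutation)

def encode(text):
    offset = 0                          # rotor position, advances per letter
    out = []
    for ch in text.upper():
        if ch not in ALPHABET:
            out.append(ch)              # pass spaces and punctuation through
            continue
        i = (ALPHABET.index(ch) + offset) % 26      # enter the rotor
        out.append(ROTOR[i])                        # wired substitution
        offset += 1                                 # rotor steps forward
    return "".join(out)

print(encode("ATTACK AT DAWN"))   # real ciphertext from a simulated machine
```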
 