Has consciousness been fully explained?

Say you don't switch your computer on. Is it still processing information? In the physical sense, yes. It's heating up, cooling down - molecules are exchanging enormous amounts of information all the time.

So heat is information? And heat exchange between particles is information processing?

Oh wait, this isn't information in the mental sense, because there's no observer?

Are you, too, a dualist?

The only difference when you run your 3D rendering program is that a tiny, tiny subset of all this information flying around becomes meaningful to you.

I'm having trouble parsing this. Do you mean:

That there is such a thing as meaningless information?
or
That information can exist without being meaningful to me?
 
Oh, I don't believe for a moment that it's "entirely removed from physical reality".

Look at it this way.... Consider these very real events:

  • My cousin marries his girlfriend
  • The Atlanta Braves win a baseball game
  • Eisenhower is sworn in as President

Are these abstractions, or are they events in objective physical reality (OPR)?

The former, obviously, and not the latter.

Walking down the aisle, catching a ball, putting a hand on a Bible... these are events in OPR.

But you can study the physics all you like and you'll never detect any of the events in the list up there, because they only happen because we all agree that they happen.

From where I sit, IP is in the same category.

A computer changes states, a pixel pattern appears on a screen. That happens in OPR.

But "adding numbers" only happens because we agree that it did. It's an abstraction. Entirely tangled up with the OPR events, but nevertheless an abstraction.

I'll ask again: what is information processing an abstraction OF?

(Or to be grammatically correct: OF what is information processing an abstraction?) :p
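To make the point concrete, here's a minimal sketch (my own toy illustration, not anything anyone in this thread has claimed): a half-adder is, physically, just two signals being combined by gates. Nothing in the physics says "addition"; that reading comes from the encoding we all agree on.

    # A toy half-adder. Physically it's just gates combining signals;
    # "addition" is the reading we agree to give it by taking
    # True = 1 and False = 0.
    def half_adder(a, b):
        sum_bit = a != b      # XOR gate
        carry_bit = a and b   # AND gate
        return sum_bit, carry_bit

    # Under the agreed encoding this "is" 1 + 1 = 10 in binary:
    print(half_adder(True, True))  # -> (False, True)

Change the agreed encoding and the identical gate behaviour realizes a different abstract function; the physics doesn't change, only the agreement does.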
 
"Your brain is (in everything it does) a computer".

Hmmm. This is interesting. I followed the exchange between Piggy and PixyMisa, and here's my input.

"A brain is functionally the same as a computer" makes some literal sense to me, but I essentially see a computer's "brain" and a human brain as like each other in analogy.

An apple will power a human body and electricity will power a computer, so they can be said to be functionally the same. Certainly the raw material of the apple, eventually, participates in the sending of electrical signals.

But there's always a relationship in nature between form and function. Unless a computer has exactly the same form as a human brain, it will never function exactly as one -- just as an apple cannot be made to function exactly like a supply of electricity.

Of course, analogous functioning can be impressive and totally convincing. One day there might be an SAI android capable of passing in society as a human and eminently capable of performing the same kinds of mental tasks as any human.

If that's true, then why distinguish between same-functioning and analogous-functioning? The main reason I see to do that is that it prevents us from making mistakes in understanding AI. If an AI brain isn't structurally the same as a human brain, it will be subject to different kinds of (even if similar) problems.

For instance, an AI computer brain might develop a kind of mental illness that a human could never have (strictly based on the difference in forms).
 
So heat is information? And heat exchange between particles is information processing?

Oh wait, this isn't information in the mental sense, because there's no observer?

Are you, too, a dualist?

There's a physical definition of information. There's an IT definition of information. The two are different. That's not dualism, that's the way it is.
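Roughly, and just as a textbook gloss (the exact forms don't matter for this thread): the IT (Shannon) definition measures information against a probability distribution over messages,

    H(X) = -\sum_x p(x)\,\log_2 p(x)   (bits)

while the physical definition tracks the number of microstates \Omega compatible with a macrostate,

    S = k_B \ln \Omega

Landauer's bound, E \ge k_B T \ln 2 per erased bit, is the usual bridge between them: erasing Shannon information has an unavoidable thermodynamic cost. Related, but not the same definition.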

I'm having trouble parsing this. Do you mean:

That there is such a thing as meaningless information?
or
That information can exist without being meaningful to me?

It depends what you mean by information. I fail to see why the computer should have the same POV as a human being as to what is meaningful information. It doesn't care about the plans for your new house. It might be more worried about the temperature of the disc drive.
 
This is incorrect.

I have taken the position from day one that everything has subjective experience, including individual particles.
Cramer's Transactional Interpretation might give an ontological materialist support for that supposition. I don't agree it does.

I tend to agree with your statement in the most basic sense, but don't think that subjective experience means much until the complexity of what we term 'life' is available. Where prions, viruses, etc. fit is unknown and likely unknowable.

But people still reject that because they can't wrap their head around the idea that the subjective experience of a particle is vastly different from that of a human, and so in fact the label "subjective experience" is meaningless.
Perhaps. Unfortunately my subjective experience is all I have to work with. What do you do?

That is the problem -- people think all subjective experience must be like a human's, and that just isn't true. So what good does it do to throw the term around in the first place?
Huh? What ever gave you that idea?

For instance, do you think the subjective experience of a dog is even remotely similar to ours? Or a bird? Where do you draw the line in "sameness?" How would you even know where to draw that line?
For humans, at other humans; for dogs, at other dogs; and so on. You already mentioned there is no line separating 'the subjective' from any existent.


Our current reality is a state of profound ignorance about consciousness. If we are honest about that ignorance, it's not wishy-washy. (Can't tell if you had a failure in parsing or a failure in expressing.)
You mean "it's SRIP" doesn't do it for you either?

... if you have a quibble with epiphenomenalists, go argue with them.
I have no quibble with some epiphenomenal positions.

Third, behavior is the only way we could possibly tell whether something else has subjective experience. Unless you know of some other way, in which case I'd love to hear about it.
I do not, nor do I think that will ever be possible. That does not stop me, or you, from noting that our subjective experiences are all we will ever have.

I'm just curious: do you think it's possible (in principle, not necessarily NOW) for a non-human computer to have subjective experience?
In any meaningful-to-humans way, not unless it is alive.

Of course I could be wrong and PixyMisa's hard materialist approach is correct. From his viewpoint he has no other acceptable course.

One could choose dualism and admit the illogic of expecting magic; or, slightly better, choose what we could call idealism, recognizing that that position also appears to entail magic when examined by current state-of-the-art science.
 
Hmmm. This is interesting. I followed the exchange between Piggy and PixyMisa, and here's my input.

"A brain is functionally the same as a computer" makes some literal sense to me, but I essentially see a computer's "brain" and a human brain as like each other in analogy.

An apple will power a human body and electricity will power a computer, so they can be said to be functionally the same. Certainly the raw material of the apple, eventually, participates in the sending of electrical signals.

But there's always a relationship in nature between form and function. Unless a computer has exactly the same form as a human brain, it will never function exactly as one -- just as an apple cannot be made to function exactly like a supply of electricity.

Of course, analogous functioning can be impressive and totally convincing. One day there might be an SAI android capable of passing in society as a human and eminently capable of performing the same kinds of mental tasks as any human.

If that's true, then why distinguish between same-functioning and analogous-functioning? The main reason I see to do that is that it prevents us from making mistakes in understanding AI. If an AI brain isn't structurally the same as a human brain, it will be subject to different kinds of (even if similar) problems.

For instance, an AI computer brain might develop a kind of mental illness that a human could never have (strictly based on the difference in forms).
A given computer need not and (as I explained explicitly and repeatedly to Piggy) usually does not behave the same as a given brain at all.

But the things that a brain can do in principle are identical to the things that a computer can do in principle.

So if you accept that consciousness is produced by the brain, it necessarily follows that you accept that computers can be conscious.

That leaves us with how the brain produces consciousness, and I've presented my (as in, really Hofstadter's, Dennett's, Ramachandran's et al.) ideas on that.
 
There's a physical definition of information. There's an IT definition of information. The two are different. That's not dualism, that's the way it is.

I call BS. Show me the physical definition of "information".

[ETA: Not that there isn't one; but I want to know how you can include "heat" as "information".]

It depends what you mean by information. I fail to see why the computer should have the same POV as a human being as to what is meaningful information.

I am not--and I think I'm not alone here--arguing that a computer has the same understanding of what's meaningful information as a human. Heck, two people quite often have different understandings of what's meaningful. What's your point?

It doesn't care about the plans for your new house. It might be more worried about the temperature of the disc drive.

What position are you arguing exactly? Sounds like you're ascribing subjective experience to the computer.
 
You mean "it's SRIP" doesn't do it for you either?
I think it's a fine attempt at a definition. But as far as its worth as an explanation goes, well, that depends on the audience, now doesn't it? As an explanation, I find it a really powerful first step.

In any meaningful-to-humans way, not unless it is alive.
Isn't this just shifting the terms of the debate? What does it mean to be alive?

Of course I could be wrong and PixyMisa's hard materialist approach is correct. From his viewpoint he has no other acceptable course.

One could choose dualism and admit the illogic of expecting magic; or, slightly better, choose what we could call idealism, recognizing that that position also appears to entail magic when examined by current state-of-the-art science.

Now I'm having a hard time understanding exactly which position you take. I understand that "It's SRIP" is not a satisfactory explanation for you.

Do you think that a non-human computer can in principle have subjective experience?
 
I call BS.
Oh, you shouldn't have done that.
Show me the physical definition of "information".

[ETA: Not that there isn't one; but I want to know how you can include "heat" as "information".]
Physical information includes all the properties of a system. Including heat.

I am not--and I think I'm not alone here--arguing that a computer has the same understanding of what's meaningful information as a human. Heck, two people quite often have different understandings of what's meaningful. What's your point?

I mean that when people ascribe to computers the potential (or actual) capacity for consciousness/thought/self-awareness, they tend to only consider matters of interest to humans. Why should the processes specifically aimed at providing information to human beings be the very things which just happen to produce self-awareness, while the other physical processes going on do nothing whatsoever?

What position are you arguing exactly? Sounds like you're ascribing subjective experience to the computer.

For humorous purposes only, I assure you. I don't think the computer is interested in its own disc drive, and I don't think it cares about the plans for the extension to your house (elegant though it may be). There are many physical processes going on in a computer, and I don't believe that any of them are particularly privileged in terms of what they produce.

I asked a long time ago for a physical definition of SRIP that would include the running of computer programs and exclude everything else. Didn't get it. I still stand by what I said back then - the unplugged pile of components is processing information - in a physical sense - just as much as the one running programs.
 

The wikipedia entry doesn't support your assertion.

You are equivocating between physical entities (or processes, in the case of heat) and facts about those entities or processes. This is a result of a category error or a lack of distinction between map and territory.

I mean that when people ascribe to computers the potential (or actual) capacity for consciousness/thought/self-awareness, they tend to only consider matters of interest to humans.
Which people are you referring to here?
Why should the processes specifically aimed at providing information to human beings be the very things which just happen to produce self-awareness, while the other physical processes going on do nothing whatsoever?
I don't know. Who makes this claim?

For humorous purposes only, I assure you. I don't think the computer is interested in its own disc drive, and I don't think it cares about the plans for the extension to your house (elegant though it may be).
Again, nobody is claiming this.

There are many physical processes going on in a computer, and I don't believe that any of them are particularly privileged in terms of what they produce.
But there is such a privileged process in humans? What is it?

I asked a long time ago for a physical definition of SRIP that would include the running of computer programs and exclude everything else. Didn't get it.
This request is as meaningful as: give me a non-physical definition of consciousness that excludes everything else.

I still stand by what I said back then - the unplugged pile of components is processing information - in a physical sense - just as much as the one running programs.

An unplugged pile of components might be dissipating heat, but it's not processing information, much less doing it "just as much" as the running computer.

Let's use the wikipedia article you cited. Physical information is the set of all properties of a thing which allow us to identify it from among other things. As the pile of components loses heat and begins to decompose, it gradually loses information. As it decomposes, we become less and less able to distinguish it from the dirt, detritus, and other particles surrounding it. Eventually, it becomes dust.
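In standard notation (a textbook gloss, not a claim about what anyone here meant): Boltzmann's

    S = k_B \ln \Omega

counts the microstates \Omega compatible with the system's coarse description, and \log_2 \Omega is the number of bits needed to single out one microstate. As the pile decomposes, \Omega grows, so a full microscopic description takes more bits, even as the distinguishing macroscopic properties - the "physical information" in the wikipedia sense - dwindle away.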

Now, is entropy a form of information processing?
 
I do not, nor do I think that will ever be possible.

I am absolutely certain that it will eventually be possible (if we don't blow ourselves up too soon).

All that's required is to figure out how the brain produces conscious experience (no small task) then see if the brains of other species have that same mechanism or a functionally analogous one.

And at that point, we'll also finally be able to design sentient machines.
 
Yes. And it's the same thing I'm talking about, albeit with specific neurological data - the brain examining its own processes, i.e. self-reference.

But Ramachandran is explaining self-awareness with mirror neurons. That is the sort of physical mechanism I am looking for. You are just saying that self-awareness is a property of SRIPs, and I am saying that only SRIPs with a mechanism akin to mirror neurons are self-aware.
 
In any meaningful-to-humans way, not unless it is alive.

I don't see why it would have to be alive. As long as the machine has the right mechanisms, I can't see it would make any difference.
 
I am saying that only SRIPs with a mechanism akin to mirror neurons are self-aware.

Which is pretty much what I'm saying, too -- there must be an additional mechanism on top of the IP, for the same reason that IP alone can't focus the retina or move a muscle.

Sofia is a behavior, not information.
 
Sofia is a behavior, not information.

Your strawman is turning into a pile of straw-powder. No one is claiming that consciousness is information. Everyone agrees that consciousness is a process.
 
The wikipedia entry doesn't support your assertion.

You are equivocating between physical entities (or processes, in the case of heat) and facts about those entities or processes. This is a result of a category error or a lack of distinction between map and territory.

The facts about the entities are properties of those entities. If information is a physically real concept, then it means the total properties of a system.

I don't know if the concept of physical information is a particularly useful one. However, if we are discussing the physical effects of information processing, then it's what we have to go on.

Which people are you referring to here?

I don't know. Who makes this claim?


Again, nobody is claiming this.

It's always been claimed by the Strong AI advocates that it's the programs running in the computers that produce the consciousness, not the totality of the physical processes. It's never been claimed that arbitrary physical interactions produce consciousness, or can produce consciousness. I'm not constructing a strawman here.

But there is such a privileged process in humans? What is it?

As I've said repeatedly, we don't know. I do think that if someone is claiming that a particular activity produces consciousness - which Pixy has stated consistently - then that activity should be physically defined. Hence the hand-waving reference to "switches".

This request is as meaningful as: give me a non-physical definition of consciousness that excludes everything else.

I'm not making the claim that computers create consciousness by running programs. It's up to those who do to state precisely how this happens.

An unplugged pile of components might be dissipating heat, but it's not processing information, much less doing it "just as much" as the running computer.

Let's use the wikipedia article you cited. Physical information is the set of all properties of a thing which allow us to identify it from among other things. As the pile of components loses heat and begins to decompose, it gradually loses information. As it decomposes, we become less and less able to distinguish it from the dirt, detritus, and other particles surrounding it. Eventually, it becomes dust.

It will probably be a lot harder to give a full description of the dust than of the pile of components.

Now, is entropy a form of information processing?

It seems a valid physical definition. If you have a better one, I'd like to hear it.
 
But Ramachandran is explaining self-awareness with mirror neurons. That is the sort of physical mechanism I am looking for. You are just saying that self-awareness is a property of SRIPs, and I am saying that only SRIPs with a mechanism akin to mirror neurons are self-aware.

He's not explained it. He's seen the mirror neurons, and he's speculating that they might have something to do with consciousness. If he has a theory, what is it?
 