
Has consciousness been fully explained?

...snip...

Now, when I hear that, I assume that if it is true, the two must be functionally or structurally similar if they are to belong to the same category of thing.

...snip...

Almost as an aside: we call computers "computers" because that is what we used to call the people who did the tasks a computer does today.
 
Ok, great. So maybe you can answer the question I put to PixyMisa: If the output of IP is information, and consciousness is not information, then how is it possible for IP alone to generate consciousness?

Well, it appears to me that the end product of the brain is a model of reality, based upon sensation, that has some sort of pragmatic value.

So the end result is something that allows the organism to perform various tasks.
 
Not really. Their brains still have a lot of development left to do. Much of their higher-level judgment, impulse control, emotional control, and ability to think through consequences and consider long-term ramifications has not yet come online.

But it's only quite recently that brain studies have determined this. It seems obvious, of course, in a way, because we all know that teens are emotional, impulsive, and myopic, but it's only in the last few years really that we've had evidence that this isn't due to lack of experience or cultural influences, but rather due to a lack of brain development in critical areas.

:) Pituitary hormones don't help either.

Some of it is also learned behaviors. :)

I figure people seem to 'mature' at about 25, when the hormones kick way down. (Anecdote only.)
 
I don't think that anyone is arguing that consciousness is definitely not produced by some kind of physical process in the brain. The argument seems to be between the computationalists (I've referred to them as Strong AI proponents), who claim that the execution of an algorithm is both necessary and sufficient to create the conscious experience, and those who deny that claim.

So we use the term 'computation' differently. I think of the brain as a very messy, sloppy, chaotic place.

I know Pixy uses a definition of computation which is very broad.

I prefer splunge.

 
This is not accurate, though.
But it is.

This is what Hofstadter and Ramachandran and Dennett are talking about - consciousness is the self-referential process from the point of view of the process itself.

Information is being processed all the time.
Certainly.

We are conscious of some of it, not conscious of the rest.
Certainly. There is no requirement for the whole brain to be involved in the consciousness loop - and it isn't. There is no requirement that there only be one consciousness loop in the brain, and there is no requirement that the conscious loop be self-consistent - and, as Ramachandran notes with several examples in that second article, it isn't.
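For what that "loop" amounts to computationally, here is a bare-bones sketch (entirely my own toy illustration, not a model from Hofstadter, Ramachandran, or Dennett): a process whose input at each step includes a description of its own previous state.

```python
import random

def strange_loop(observe, steps):
    """A toy process whose input at each step includes a description of
    its own previous state -- the bare bones of a self-referential loop."""
    self_model = 'initial'
    history = []
    for _ in range(steps):
        stimulus = observe()
        # The process reacts to the world AND to its model of itself...
        state = (stimulus, self_model)
        # ...then updates that self-model to describe what it just did.
        self_model = f'reacted to {stimulus:.2f}'
        history.append(state)
    return history

print(strange_loop(lambda: random.random(), steps=3))
```

Nothing here is conscious, of course; the sketch only shows the structural ingredient the loop picture appeals to, and nothing rules out several such loops running at once.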
 
And it is a wiki page, so which professionals use that broad definition? Hmmm, and how?

If you have a better definition of Physical Information then feel free.

If we are discussing the physics of information transfer, then it really doesn't matter what the layman's understanding of the word is. We presumably want to know how the physics works.

Incidentally, the reference was originally given by Pixy. One might almost suppose that he hadn't read it...
 
So we use the term 'computation' differently. I think of the brain as a very messy, sloppy, chaotic place.

I know Pixy uses a definition of computation which is very broad.

Pixy regards computation as something that can be done on a Turing Machine. He's stated a number of times that a Turing Machine can do anything the brain can do.
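For concreteness, here is a minimal sketch of computation in that sense (my own toy example in Python; the states and transition table are illustrative, not anything from the thread): a Turing machine is nothing more than a tape, a head, and a lookup table.

```python
# A minimal Turing machine: (state, symbol) -> (new_state, new_symbol, move).
# This toy table increments a binary number; the head starts at the
# rightmost digit. All names here are illustrative placeholders.

def run_turing_machine(table, tape, state, head):
    tape = dict(enumerate(tape))          # sparse tape; blank cell = ' '
    while state != 'halt':
        symbol = tape.get(head, ' ')
        state, write, move = table[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape)).strip()

increment = {
    ('carry', '1'): ('carry', '0', 'L'),  # 1 + carry = 0, carry moves left
    ('carry', '0'): ('halt',  '1', 'L'),  # 0 + carry = 1, done
    ('carry', ' '): ('halt',  '1', 'L'),  # ran off the left edge: new digit
}

print(run_turing_machine(increment, '1011', 'carry', 3))  # -> '1100'
```

The Church-Turing thesis is the claim that anything reasonably called computation reduces to a table like this, however large; Pixy's second claim is that brains fall inside that boundary.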
 
It occurred to me last night that people aren't always conscious in the same ways to begin with.

For example, if I am at the theatre and I am really paying attention to the movie (which implies I am not paying much attention to anything else) am I conscious of my current self? Of the environment? Of anything besides the information being conveyed to me by the photons coming from the screen and the sound waves coming from the speakers?

Part of the struggle I think many people have with this issue is that for some reason they lump all aspects of the conscious experience into the same term "consciousness" and really miss the important fact that "consciousness" isn't one thing in humans. They say "oh a computer can't be conscious because it can't write poetry" or something stupid like that -- well are you writing poetry when you watch a movie?
 
It's rather revealing that strong AI supporters only discuss consciousness in terms of intelligence.
The human experience of consciousness however is much more than intelligence.
Is art a form of intelligence like mathematics?
Perhaps Pixy and his mates are right, since they are only talking about intelligence, and a simple pocket calculator can do math.
What we need is a definition of consciousness which encompasses more than just intelligence.

It is rather revealing that strong AI opponents always resort to discussing consciousness in terms of painting masterpieces and writing sonnets and feeling love.

Are you claiming that a human who is unable to do such things is necessarily not conscious?

Are you?

I presume you are not claiming that. So just stop bringing up such an utterly stupid argument.
 
Yes - but the claim is that the experience produced by the program is exactly the same regardless of the substrate. The only effect the physical interactions of the computer have is to allow the program to run. (Incidentally, there are many other physical interactions taking place on the computer which do not enable the program to run. They are independent of its operation).

Functionalism says that an algorithm, given the same inputs, will always produce the same outputs, regardless of the substrate it is instantiated upon.

Dennett, and others, claim that p-zombies are not possible, meaning that you can't have something that acts conscious in all ways but has no subjective experience.

You seem to be conflating these two claims to say that algorithms operating in different substrates will always have exactly the same subjective experience. I'm not sure this exact claim is made by SAI advocates. If it is, I'd be interested to see it.
 
It occurred to me last night that people aren't always conscious in the same ways to begin with.

For example, if I am at the theatre and I am really paying attention to the movie (which implies I am not paying much attention to anything else) am I conscious of my current self? Of the environment? Of anything besides the information being conveyed to me by the photons coming from the screen and the sound waves coming from the speakers?
The MIT Introduction to Psychology lecture series covers this. :)

Part of the struggle I think many people have with this issue is that for some reason they lump all aspects of the conscious experience into the same term "consciousness" and really miss the important fact that "consciousness" isn't one thing in humans. They say "oh a computer can't be conscious because it can't write poetry" or something stupid like that
And then, when you present them with poems written by computers, they really start to get desperate.
 
You seem to be conflating these two claims to say that algorithms operating in different substrates will always have exactly the same subjective experience. I'm not sure this exact claim is made by SAI advocates. If it is, I'd be interested to see it.
Yes, they would. An algorithm is, by definition, perfectly repeatable and platform-independent.
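As a trivial illustration of that repeatability (my own example, not from the thread): a fully specified algorithm such as SHA-256 yields bit-identical output on any conforming platform.

```python
import hashlib

def digest(message: str) -> str:
    # SHA-256 is specified down to the bit level, so this returns the
    # same hex string on any hardware and any conforming implementation.
    return hashlib.sha256(message.encode('utf-8')).hexdigest()

# Run this on a phone, a mainframe, or a cluster: the output is identical.
print(digest('the same inputs'))
```

Identity of outputs is the uncontroversial part; whether identity of *experience* follows is exactly what the next post disputes.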
 
Functionalism says that an algorithm, given the same inputs, will always produce the same outputs, regardless of the substrate it is instantiated upon.

Dennett, and others, claim that p-zombies are not possible, meaning that you can't have something that acts conscious in all ways but has no subjective experience.

You seem to be conflating these two claims to say that algorithms operating in different substrates will always have exactly the same subjective experience. I'm not sure this exact claim is made by SAI advocates. If it is, I'd be interested to see it.

You may be right - I'm going by the arguments I've had on this subject on JREF. Piggy, for example, considers himself a Strong AI advocate, but doesn't necessarily accept this. However, when Pixy claims that a Turing machine can do anything that a brain can do - and also claims to have proved it - the above seems to imply that it can create the exact same subjective experience.

There are obviously a wide range of opinions on this subject. The main thing I disagree with is the contention in the OP.
 
Pixy regards computation as something that can be done on a Turing Machine. He's stated a number of times that a Turing Machine can do anything the brain can do.

The first claim is incontrovertible. Unless, as someone tried to do earlier in the thread, you define computation as something only a human can do.

The second claim can't be right, for the trivial reason that the brain is a physical organ that does things like disperse and take up neurotransmitters--which an algorithm doesn't do.

What Pixy is claiming, I think, is that if we know the exact effects of neurotransmitter dispersal and re-uptake on the IP the brain does, then we can model those functions in an algorithm, such that the effects on the algorithm's IP are identical to what the brain does.

Essentially, it might be more clear to say that the algorithm can do anything that the mind can do--with the understanding that the mind is the algorithm running on the hardware of the brain.

I'm sure Pixy will correct me if I'm wrong. :D
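To make the "model those functions" idea concrete, here is a deliberately crude sketch (the first-order kinetics and every rate constant are hypothetical placeholders, not real neurochemistry):

```python
def synapse_step(concentration, released, reuptake_rate, dt):
    """One time step of a toy synaptic cleft model: neurotransmitter is
    released into the cleft, then cleared by first-order re-uptake."""
    concentration += released
    concentration -= reuptake_rate * concentration * dt
    return concentration

# Hypothetical parameters: a single release event, then passive clearance.
c = 0.0
trace = []
for step in range(100):
    release = 1.0 if step == 0 else 0.0
    c = synapse_step(c, release, reuptake_rate=0.5, dt=0.1)
    trace.append(c)
# trace decays exponentially toward zero after the release spike.
```

If a model like this (vastly refined) reproduces the transmitter's effect on the brain's IP, the claim is that nothing relevant to the computation has been left out.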
 
What Pixy is claiming, I think, is that if we know the exact effects of neurotransmitter dispersal and re-uptake on the IP the brain does, then we can model those functions in an algorithm, such that the effects on the algorithm's IP are identical to what the brain does.
You don't even need that. You can just model it at the molecular level and let the molecules do their stuff. You don't need to know how the brain works to simulate it.

Essentially, it might be more clear to say that the algorithm can do anything that the mind can do--with the understanding that the mind is the algorithm running on the hardware of the brain.

I'm sure Pixy will correct me if I'm wrong. :D
The more general point is that consciousness is informational. That means it has long since been proven to be substrate-neutral.

If you correctly simulate a brain in full detail at whatever level you choose, it will be conscious.
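For a sense of what "model it at the molecular level" means algorithmically, here is a hedged sketch of the standard approach (a toy velocity-Verlet integrator over point particles; the force law and all parameters are placeholders, and a molecular-level brain simulation is far beyond current hardware):

```python
import numpy as np

def velocity_verlet(pos, vel, force_fn, mass, dt, steps):
    """Advance point particles under force_fn without knowing, or needing
    to know, what higher-level function the system implements."""
    acc = force_fn(pos) / mass
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * acc * dt**2
        new_acc = force_fn(pos) / mass
        vel = vel + 0.5 * (acc + new_acc) * dt
        acc = new_acc
    return pos, vel

# Toy force law: harmonic springs pulling every particle toward the origin.
spring = lambda pos: -1.0 * pos

pos = np.random.default_rng(0).normal(size=(100, 3))   # 100 particles in 3D
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel, spring, mass=1.0, dt=0.01, steps=1000)
```

The design point Pixy is making: at this level you only need the physics to be right; the "brain-ness" is supposed to emerge from the dynamics rather than be programmed in.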
 
Yes, they would. An algorithm is, by definition, perfectly repeatable and platform-independent.

I'm not disputing that the algorithm is platform-independent. I've just never heard that the algorithm is necessary and sufficient to determine all properties of subjective experience.

I think I'm just making a slightly weaker claim than you are--from a position of profound and possibly permanent ignorance of the qualitative details of another's subjective experience.

I guess I'm positing something like this (possibly Sydney Shoemaker did before me?):

Soft = software
Hard = hardware
Bhvr = conscious behavior (behavior that leads others to think an entity is conscious)
Exp = conscious experience (the private subjective "what it's like" for an entity)

Soft A + Hard A ==> Bhvr A + Exp A

Soft B + Hard A ==> Bhvr B + Exp B

Soft A + Hard B ==> Bhvr A + Exp C

Soft B + Hard B ==> Bhvr B + Exp D

In other words, identical software run on any hardware will always result in identical behavior. (Substrate independence claim from functionalism.)

The combination of software and hardware determines a possibly unique subjective experience. It's not clear to me that subjective experience needs to be fully determined by software. But--this does entail epiphenomenalism, which I intuitively object to, while not having a strong argument against.
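A toy formalization of that table (my own sketch; the names mirror the abbreviations above): behavior is a function of software alone, while experience, on this proposal, is keyed to the software-hardware pair.

```python
from typing import NamedTuple

class Outcome(NamedTuple):
    behavior: str    # Bhvr: publicly observable, fixed by software alone
    experience: str  # Exp: private, possibly fixed by software + hardware

def run(soft: str, hard: str) -> Outcome:
    behavior = f"Bhvr({soft})"            # substrate-independence claim
    experience = f"Exp({soft}, {hard})"   # the weaker claim posited above
    return Outcome(behavior, experience)

assert run('A', 'A').behavior == run('A', 'B').behavior      # Bhvr A both times
assert run('A', 'A').experience != run('A', 'B').experience  # Exp A vs Exp C
```

The strong-AI position collapses the experience column into the software column; the table above deliberately leaves that open.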
 