Are You Conscious?

Are you conscious?

  • Of course, what a stupid question: 89 votes (61.8%)
  • Maybe: 40 votes (27.8%)
  • No: 15 votes (10.4%)

Total voters: 144
Can this machine, which is capable of learning from the behaviour of humans it picks up with its various sensors (but does not "observe" them, because "there's nobody home"), learn the meaning of the word "consciousness"?

Can we?
Sure.

What is the meaning of the word consciousness?

We know the meaning of the word because of our intimate, first-hand experience of consciousness, regardless of whether we can express it to the satisfaction of the sophists and pedants.

Note that I'm not suggesting that there's anything magical or mystical about consciousness.

The question is, could this computer devise an experiment which tested the efficacy of various methods of alleviating depression in humans?

I would say it could.

I hope you're right.
 
If you had no sense of taste, could you know what a peach tasted like simply by inspecting a scan of someone's brain?

Wait a minute. This can't be the same Robin who had such a fit over "Mary's Room".
 
We know the meaning of the word because of our intimate, first-hand experience of consciousness, regardless of whether we can express it to the satisfaction of the sophists and pedants.
Well, no. If you can't define a word, you don't know what it means.

I'm not saying that we don't know what it means, just that your argument is insufficient.
 
I'll repeat a question...

Imagine we build a computer which can pretty accurately replicate all the cognitive functions of a human brain - that is, it can carry out all of the computations, and does so in roughly the same way.
Sure.

The computer is not conscious.
Wrong.

It has no internal awareness of anything.
Wrong.

But it is perfectly capable of learning language and ends up with a complex set of concepts based on an internal model it has built of physical reality. So it knows what atoms and stars and humans are, and it can understand verbs concerning behaviour.
You can't do that without it being conscious. It's not computationally possible. You run head first into a combinatorial explosion of information, and the only way to prune it back is to introduce self-reference. That's the fundamental fallacy of Searle's Chinese Room - the room cannot possibly exist in the first place, not even in principle.

Can this machine, which is capable of learning from the behaviour of humans it picks up with its various sensors (but does not "observe" them, because "there's nobody home"), learn the meaning of the word "consciousness"?
The computer is conscious.
 
Wait a minute. This can't be the same Robin who had such a fit over "Mary's Room".
I didn't have a fit over Mary's room - I just pointed out the flaw in the logic: you were using two different standards for the term "complete knowledge".

And I haven't said anything contradictory here - I have been using this example to illustrate what I mean by conscious experience for years in this forum.

When she starts to experience colour, Mary doesn't learn about a new experience - she already knew about it. But she experiences it for the first time.
 
If you had no sense of taste, could you know what a peach tasted like simply by inspecting a scan of someone's brain?
I could know that a sense of taste existed by inspecting a scan of someone's brain.

I could know that this someone enjoyed the flavour. I could learn that they associated the taste of a peach with other sensations and memories.

With enough experimental data on the workings of my own brain and that of the peach-eater, and a suitably detailed theory of perception, I could induce corresponding sensations in my own brain.

So, in essence, yes. But your question is almost entirely beside the point, and the only thing necessary is the very first sentence: I could know that a sense of taste existed. We may still describe it as a private behaviour, but we can observe it objectively. This is true for everything the brain does - and for everything in general.
 
Fine then tell me the meaning.
We know the meaning of the word because of our intimate, first-hand experience of consciousness, regardless of whether we can express it to the satisfaction of the sophists and pedants.
Experiencing something and knowing the meaning of it are different things.

Whether or not you can express it to the satisfaction of sophists and pedants, can you express it better than UE's computer could? Could you say anything about consciousness that the computer could not, besides the fact that you are experiencing it?
 
I could know that a sense of taste existed by inspecting a scan of someone's brain.
Of course
I could know that this someone enjoyed the flavour. I could learn that they associated the taste of a peach with other sensations and memories.
Of course
With enough experimental data on the workings of my own brain and that of the peach-eater, and a suitably detailed theory of perception, I could induce corresponding sensations in my own brain.
Of course - and this would be essentially to undergo the experience.
So, in essence, yes.
In essence and in fact - no. You could not know what it was like unless you actually underwent the experience.
But your question is almost entirely beside the point, and the only thing necessary is the very first sentence: I could know that a sense of taste existed.
That is entirely beside the point.
We may still describe it as a private behaviour, but we can observe it objectively. This is true for everything the brain does - and for everything in general.
But you could not know what it was like unless you actually had the experience - and this is why it is termed private, or subjective.
 
Sure it does, since the fMRI scan demonstrates nothing but activity that is meaningless, undecodable, and indecipherable with respect to any specific private behavior, and most specifically regarding awareness of awareness.

...snip...

We already have "bionic" limbs that people can use by just "thinking" about using them, so we do have electronic systems that can already decode "my awareness of wanting to move my hand".
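That decoding step can be sketched in miniature. Everything below is invented for illustration - the channel counts, firing rates, and the nearest-centroid rule are toy assumptions, not a description of how any real prosthetic works - but it shows the basic idea of classifying "intent to move" from recorded neural activity.

```python
# Toy sketch (not a real BCI pipeline): decoding "intent to move" from
# simulated motor-cortex firing rates with a nearest-centroid rule.
# All numbers and labels here are invented for the example.
import random

random.seed(0)

def simulate_trial(intent: bool, n_channels: int = 8) -> list[float]:
    """Simulated firing rates: channels fire faster when movement is intended."""
    base = 10.0
    boost = 6.0 if intent else 0.0
    return [random.gauss(base + boost, 2.0) for _ in range(n_channels)]

def centroid(trials: list[list[float]]) -> list[float]:
    """Per-channel mean firing rate across a set of labelled trials."""
    return [sum(col) / len(col) for col in zip(*trials)]

# "Calibration": record labelled trials and average them into centroids.
c_rest = centroid([simulate_trial(False) for _ in range(50)])
c_move = centroid([simulate_trial(True) for _ in range(50)])

def decode(trial: list[float]) -> str:
    """Classify a new trial by which centroid it lies closer to."""
    def dist(c: list[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(trial, c))
    return "move" if dist(c_move) < dist(c_rest) else "rest"

print(decode(simulate_trial(True)))
print(decode(simulate_trial(False)))
```

Real systems face far noisier signals and use much more sophisticated decoders, but the principle - map recorded activity onto a learned model of "what this brain does when it wants X" - is the same.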
 
Of course

Of course

Of course - and this would be essentially to undergo the experience.

In essence and in fact - no. You could not know what it was like unless you actually underwent the experience.
That's only because you've defined it that way.

Say you've never seen the colour orange. If I tell you that it's something like red, and something like yellow, that will give you a pretty good idea what the experience is like.

But you've defined knowing what the experience is like as having had the experience. So your argument is at best a tautology.

That is entirely beside the point.
And that is only because you've decided that a different point is now the point.

But you could not know what it was like unless you actually had the experience - and this is why it is termed private, or subjective.
Yes. This is why simile and metaphor and analogy are utterly absent from human communication.

Oh, wait...
 
...snip...

But you could not know what it was like unless you actually had the experience - and this is why it is termed private, or subjective.

We know it is entirely possible to induce "experiences" by direct manipulation of the brain, so in principle why wouldn't it be possible to mimic the changes that occur in the brain when someone is tasting a peach (which involves many, many things other than just the brain) without ever tasting (as we usually use the word) a peach?
 
I've mentioned this in the past, but I once had an x-ray scan that required me to be injected with a dye that was opaque to x-rays. One of the side-effects of the dye they used on me is that you get a "taste in the back of your throat", described as "rusty nails". I'd never tasted rusty nails, yet that description matched my "private experience".
 

Because it would be a phenomenon. Phenomena are there to be explained.

It doesn't matter if consciousness permeates the entire universe at all times, or if it is a momentary fluctuation at a single point. If it happens, it is something in the list of things that physics has to explain. And there's no way of knowing its relative importance until it is explained.
 
Really?

Then define it. :p


Easy. Consciousness is a skill or ability similar to walking or playing the piano. It isn't a "thing", it isn't a property, it isn't a process, and it certainly doesn't exist separate from a being that can learn that skill or display the usage of it, any more than piano playing can exist without a pianist. Technically, consciousness as most people use the term is two skills: the ability to distinguish between "me" and "not me", and the ability to recognize that ability in others. The reason consciousness becomes such a bone of contention in philosophy is that it is directly tied in with morality.

The ability to recognize consciousness in others can only be based on what Mercution calls public behavior. If an object does not display the public behavior I have learned to associate with consciousness, I consider it to be not conscious. So for example, a rock does not display the ability to distinguish between itself and others, so I consider it to be not conscious. My dog does display this behavior, so I consider it to be conscious. A sleeping person doesn't display the ability to distinguish between "me" and "not me" at that moment, so he or she is temporarily not conscious.

Because of this, I consider a rock to be not conscious, and therefore actions taken with or against a rock are amoral (without moral consequences). I consider my dog to be conscious, and therefore actions taken with or against my dog can be moral or immoral, but not amoral. With this in mind, it is fine to pound a tent stake in with a rock, but not to do so with my dog.

Like most skills, this has to be learned and some people/animals/things do not seem to be able to learn it as well as others. This is why you may see the occasional psychopath (or two year old ;)) trying to pound tent stakes in with the family dog.

Computer programs tend to fall in a gray area with some people recognizing the computer's ability to distinguish between "me" and "not me" and others failing to do so. I believe this has to do with people having varying levels of skill, maybe in a few cases having too much talent for recognizing that ability in others, such as those who subscribe to the "everything is conscious" belief.
 
Bah.

I need to use exactly zero words in order to know what "consciousness" means.

YMMV

Philosophically, the difference between the thing, the word indicating the thing and the possible definition of the word are very distinct. It isn't always possible to have a precise definition. Prizing the definition over the reality is being two steps away from what is important.

Of course it's possible to produce a precise definition for consciousness, but at the cost, in most cases, of producing a definition of something else.
 