The Hard Problem of Gravity

That's pretty vague. My computer can access my printer, but I don't think that's what we mean by "consciousness". The computer is aware of the printer, but is not conscious as a result. Self-awareness, however...

Just check out a few modern neuroscientific papers on consciousness, Belz. You'll soon grasp the meaning. For example, "Converging Intracranial Markers of Conscious Access".

Once you start talking about self-awareness at a neuronal level of investigation you have to define self at a neuronal level. Fancy that one?


Feelings are thoughts, which are actions. So, fail.

Thoughts are often termed "inner speech" in cognitive neuroscience. See, for example, Baars - In the Global Workspace of Consciousness. Feelings are different.

There are different ways of defining these things, for example in AI, I think, but I'm personally happy with neuroscience here.

Nick
 
Once you start talking about self-awareness at a neuronal level of investigation you have to define self at a neuronal level. Fancy that one?

I'm not sure we're using the same definition of "self", here. By "self", I mean the thing having the awareness, not some Cartesian theatre observer.

Thoughts are often termed "inner speech" in cognitive neuroscience. See, for example, Baars - In the Global Workspace of Consciousness. Feelings are different.

How different? Feelings are actions, thoughts are actions. So again, fail. What is there aside from stuff and the stuff that stuff does?
 
How different? Feelings are actions, thoughts are actions. So again, fail. What is there aside from stuff and the stuff that stuff does?

Thoughts seem to be structured mental elements while emotions are the valence of those elements -- the subjective weight we place on objects of thought.

For instance, two people may think about the same conceptual object: 'dog'. In one individual, the thought 'dog' elicits overall positive emotions toward said object, and with them, a higher likelihood of exhibiting positive outward behaviors toward dogs. The other person may have an overall antipathy towards the conceptual object 'dog', with a greater likelihood of expressing behaviors like the fight-or-flight response towards perceived dogs.

The most significant thing to note is that the subject may exhibit no external behaviors in response to their thoughts or the emotional weight attributed to them. Simply calling thoughts and emotions 'actions' doesn't tell us anything useful other than saying "they happen".
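To make the distinction concrete, here's a rough Python sketch -- every name and number in it is invented purely for illustration -- of thoughts as structured elements carrying a separate valence weight, where neither necessarily produces outward behavior:

import random

# Toy model (hypothetical): a "thought" is a structured element (a concept),
# and the "feeling" is a separate valence weight attached to it.
valence = {
    "dog": +0.8,   # person A: positive affect toward dogs
    # "dog": -0.6, # person B might carry the opposite weight
}

def likelihood_of_approach(concept):
    """Map valence onto a probability of positive outward behavior."""
    v = valence.get(concept, 0.0)
    return max(0.0, min(1.0, 0.5 + 0.5 * v))

def respond(concept):
    # Crucially, the thought and its weight may produce no behavior at all.
    if random.random() < likelihood_of_approach(concept):
        return "approach"
    return "no external behavior"

print(respond("dog"))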
 
I'm not sure we're using the same definition of "self", here. By "self", I mean the thing having the awareness, not some Cartesian theatre observer.

And, at a neuronal level, the thing having the awareness is...



How different? Feelings are actions, thoughts are actions. So again, fail. What is there aside from stuff and the stuff that stuff does?

The whole universe may be considered actions. So what?

Nick
 
Neurons aren't self-aware, unless I'm mistaken, so nothing.

So, when you're discussing the neuronal level of processing, defining consciousness as self-awareness is meaningless. Agreed?



So nothing. You've just answered my question. Therefore "I" is nothing other than actions. You claimed otherwise.

Belz, a couple of posts dialoguing with you and death loses its sting.

Nick
 
I think so. ;)

I haven't seen a precise definition of this usage of "think", but yes, I'd say that SHRDLU thinks.

I came across a piece by Terry Winograd, SHRDLU's creator, who seems less than optimistic...

Terry Winograd said:
Artificial intelligence researchers predict that "thinking machines" will take over our mental work, just as their mechanical predecessors were intended to eliminate physical drudgery. Critics have argued with equal fervor that "thinking machine" is a contradiction in terms. Computers, with their foundations of cold logic, can never be creative or insightful or possess real judgment. Although my own understanding developed through active participation in artificial intelligence research, I have now come to recognize a larger grain of truth in the criticisms than in the enthusiastic predictions. The source of the difficulties will not be found in the details of silicon micro-circuits or of Boolean logic, but in a basic philosophy of patchwork rationalism that has guided the research. In this paper I review the guiding principles of artificial intelligence and argue that as now conceived it is limited to a very particular kind of intelligence: one that can usefully be likened to bureaucracy.

Winograd (1991)

Nick
 
PixyMisa said:
I haven't seen a precise definition of this usage of "think", but yes, I'd say that SHRDLU thinks.

I came across a piece by Terry Winograd, SHRDLU's creator, who seems less than optimistic...


...snip...


Nick

I'm sure PixyMisa would just point out how Winograd's assessment of his own creation is 'wrong' and 'irrelevant' :rolleyes:
 
So, when you're discussing the neuronal level of processing, defining consciousness as self-awareness is meaningless. Agreed?

No. That's how I always define self-awareness. The definition doesn't change. Neurons aren't self-aware, to the best of my knowledge, that's all.

Belz, a couple of posts dialoguing with you and death loses its sting.

Not my fault if you lost track of the conversation. Go back a few posts to where this started and you might get it.
 
I'm sure PixyMisa would just point out how Winograd's assessment of his own creation is 'wrong' and 'irrelevant' :rolleyes:

I don't know anywhere near enough about AI to make an assessment. But what I see is that people like Winograd and Hofstadter are actually quite critical of the more ambitious statements of AI's potential.

For example, in I Am A Strange Loop (great book) Hofstadter completely lays into an unnamed R&D head at Intel for his comment that Deep Blue could only process, but Stanley thinks.

Both of these guys seem to enjoy bringing the more wild-eyed characters of the AI scene back down to earth.

So, who knows?

Nick
 
I'm sure PixyMisa would just point out how Winograd's assessment of his own creation is 'wrong' and 'irrelevant' :rolleyes:
Not at all. It makes perfect sense, and is entirely congruent with what I am saying.

Consciousness is actually very simple, and Winograd demonstrated this.

Turns out that consciousness is not only not a "hard" problem, it's not much of a problem at all, and not, fundamentally, all that interesting.

The real issues are elsewhere - language, learning, and perception being three of the major fields of research since then.
 
I came across a piece by Terry Winograd, SHRDLU's creator, who seems less than optimistic...
Sure. That was written twenty years after SHRDLU, and nearly twenty years have passed since then, and progress has been grindingly slow.

Which doesn't change the fact that consciousness is a solved problem. Indeed, the great majority of the interesting stuff going on in our brains takes place at the subconscious level. How do you recognise your mother? How do you recognise your own left hand? These are not conscious processes.
 
I don't know anywhere near enough about AI to make an assessment. But what I see is that people like Winograd and Hofstadter are actually quite critical of the more ambitious statements of AI's potential.

For example, in I Am A Strange Loop (great book) Hofstadter completely lays into an unnamed R&D head at Intel for his comment that Deep Blue could only process, but Stanley thinks.

Both of these guys seem to enjoy bringing the more wild-eyed characters of the AI scene back down to earth.

So, who knows?

Nick

Well, it's good to know that there are so many levelheaded people in the field :)
 
The real issues are elsewhere - language, learning, and perception being three of the major fields of research since then.

Like I've said, thus far you've been using such a loose definition of the term 'consciousness' that it essentially has no meaning. If you employ such a broad definition of consciousness then of course the 'problem' disappears and you can easily hand-wave the more pertinent questions away. When most other people speak of consciousness they are referring to the capacity for perception, and when philosophers speak of 'qualia' they are referring to the qualitative nature of perception. Capacities like language and learning, in and of themselves, do not pose any significant philosophical or scientific problems. It's the question of the nature and cause of perception that lies at the root of the issue.
 
Neither are they computers. How interesting.



And a bacterium would still behave differently from a mush of random biochemicals. How interesting.



How interesting. This invalidates the notion of computing how, exactly?



So does the definition of a crystal, you genius.

Did you miss the part of my definition where I said "effectively means according to an arbitrary threshold"?

There is no hard threshold for a crystal. Some crystals are much more ordered than others. The way we determine the threshold for a crystal is the point at which the system displays a non-linear behavior difference. It is different for every system. You don't know this?

I didn't define that threshold for computing, because I don't need to. All I have to say is that it exists -- and it does. One can put it anywhere they want. You can say a bowl of soup computes if you want to put the threshold very low. I could call liquid water crystalline if I put the threshold very low. What you cannot say is that a bowl of soup computes just as well as a computer, because my definition prevents it. Just like you cannot say that the structure of liquid water is as ordered as that of ice. And at some point, there is a non-linear behavior difference in the ability of a system to categorize input -- it begins to compute. At some point, there is a non-linear change in the order of a system -- it crystallizes.

When you connect electronic components properly, there is a huge non-linear difference in the system behavior.
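A rough sketch in Python of what I mean -- the "order parameter" and every number here are invented for illustration, not a real measurement:

import math

# Toy order parameter: how well a system sorts its inputs, as a function of
# how properly its components are connected. The steep logistic models the
# non-linear jump in behavior (soup -> computer, liquid -> crystal).
def categorization_ability(connectedness):
    return 1.0 / (1.0 + math.exp(-20.0 * (connectedness - 0.5)))

THRESHOLD = 0.9  # arbitrary cutoff; put it wherever you like

for c in (0.1, 0.4, 0.5, 0.6, 0.9):
    a = categorization_ability(c)
    verdict = "computes" if a >= THRESHOLD else "doesn't (usefully) compute"
    print(f"connectedness={c:.1f}  ability={a:.3f}  -> {verdict}")

The cutoff is arbitrary, but the jump in the curve is not -- which is the whole point.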

You keep talking about the fact that isolated components can compute -- so what? Isolated portions of a droplet can crystallize. Who cares? Do you even know what a "system" is, westprog? Are you familiar with the notion of a "system"?

Thanks RD! I think I actually learned something from this post! I hadn't been entirely sure what was being argued about for about 20 pages now, but I think I've got your position now.

Everything comes in levels or degrees. A useful definition accounts for these degrees. A definition of running that only allowed for something to be called 'running' if the organism was bipedal and moving at precisely 10 mph would be useless. We would have to come up with another term for every other creature and every other speed and every other combo thereof. There are degrees of running.

A definition of a computer mouse that only applied to specific designs would also be useless. People would be saying things like 'This doesn't have buttons on the sides, it's not a mouse, it's something else' or 'This one doesn't have a wheel, it's not a mouse'. There are degrees of mouse-ness.

In the same way, a definition of consciousness that only applies to a specific level of complexity is causing confusion. Is a mentally disabled person conscious? A dog? A beetle? Should we come up with new words to describe the behavior of these creatures?

Any definition of ANYTHING can be taken down the way people are trying to take down your definition of consciousness. If I define a computer mouse as 'that which controls the pointer on a computer monitor', then someone could easily crap on that definition by saying 'Well, what about the pad on my laptop, is that a mouse?' or 'What if I unplug it, does it stop being a computer mouse?' And the more you adjust for things like this, the more cumbersome and useless the definition becomes.

This is because there is no hard line between what is a computer mouse and what is not, or what is conscious and what is not. And as soon as you draw that line, there are suddenly all these things that are somewhat conscious or somewhat mouse-like that are outside of the line. What do you call these things? Do you make up another term and draw another line?
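The same point in code -- a toy graded-membership sketch in Python, with features and weights invented purely for illustration:

# Treat membership as a degree rather than a hard yes/no line.
FEATURES = {"controls_pointer": 0.6, "has_buttons": 0.2, "has_wheel": 0.2}

def mouse_ness(device):
    """Sum the weights of the features a device actually has (0.0 to 1.0)."""
    return sum(w for f, w in FEATURES.items() if f in device)

print(mouse_ness({"controls_pointer", "has_buttons", "has_wheel"}))  # 1.0
print(mouse_ness({"controls_pointer"}))  # a trackpad scores 0.6
print(mouse_ness(set()))                 # a rock scores 0.0

Drawing the line at, say, 0.5 or 0.9 is our choice; the degrees are what's real.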

Computer mice and consciousness are concepts. Ideas. They are words used to describe how objects behave. The reason we have words, language, is to communicate these ideas to each other.

Your computer mouse is still a computer mouse, even though it is not like my computer mouse. We have other words for describing those differentiating features.

A rabbit is conscious, even though it is not conscious like a human is. We have other words for those differentiating features.

Trying to whittle down to what the core or essence of consciousness really is, is just as pointless as trying to whittle down to what the core or essence of being a computer mouse really is.

The more you dig into any definition, the less sense it will make.
 
Thanks RD! I think I actually learned something from this post! I hadn't been entirely sure what was being argued about for about 20 pages now, but I think I've got your position now.

...snip...

The more you dig into any definition, the less sense it will make.

Yeah.

The take-home point is that as far as we can tell, the entire universe is nothing but fundamental particles and to the extent that X is different from Y it is only because of the behavior of the systems of particles that make up X or Y.

That is why creating artificial qualitative differences between categories of these behaviors -- crystallization versus computation, for example -- is a fallacy. It isn't that these aren't different, it is just that the difference is only a question of behavior rather than something else... whatever else... people seem to think.

At a fundamental level crystallization is just a certain category of behavior of systems of particles. So is computation. So is consciousness. So is human art and literature.

The fact that crystallization is easy to describe, while computation is more complex, and consciousness still more complex, and art and literature among the most complex of all, has nothing to do with whether they are all the same kind of thing.

And they are all the same kind of thing, fundamentally -- the behavior of systems of particles.
 
Thanks RD! I think I actually learned something from this post! I hadn't been entirely sure what was being argued about for about 20 pages now, but I think I've got your position now.

...snip...

The more you dig into any definition, the less sense it will make.


OOOOH

Maybe they will call you a Dirty Eliminativist as well!

When someone says something like 'it is blazingly obvious I am conscious', you usually get a really vague answer as to why.

Then you get "You are stupid for not just agreeing with me."

Or "I can't define it." or "I will use vaguer terms to get to my vaguer meaning."

My favorite
"Scientists are not spending enough time studying that which I can't define."

Great post. Semantic meaning is always idiomatic and self-referential.
 
