That's pretty vague. My computer can access my printer but I don't think that's what we mean by "consciousness". The computer is aware of the printer, but is not conscious as a result. Self-awareness, however...
Feelings are thoughts, which are actions. So, fail.
Once you start talking about self-awareness at a neuronal level of investigation you have to define self at a neuronal level. Fancy that one?
Thoughts are often termed "inner speech" in cognitive neuroscience. See, for example, Baars - In the Global Workspace of Consciousness. Feelings are different.
How different? Feelings are actions, thoughts are actions. So again, fail. What is there aside from stuff and the stuff that stuff does?
I'm not sure we're using the same definition of "self", here. By "self", I mean the thing having the awareness, not some cartesian theatre observer.
And, at a neuronal level, the thing having the awareness is...
The whole universe may be considered actions. So what?
Simply calling thoughts and emotions 'actions' doesn't tell us anything useful other than saying "they happen".
Neurons aren't self-aware, unless I'm mistaken, so nothing.
So nothing. You've just answered my question. Therefore "I" is nothing other than actions. You claimed otherwise.
I think so.
I haven't seen a precise definition of this usage of "think", but yes, I'd say that SHRDLU thinks.
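For readers who don't know the program: SHRDLU answered typed English commands about a simulated blocks world, updating an internal model as it went. Here is a minimal sketch of that style of program in Python; the toy grammar and world model are illustrative inventions, not Winograd's actual code:

```python
# Toy blocks-world interpreter in the spirit of SHRDLU (not Winograd's code).
# World state: a dict mapping each block to whatever it is resting on.
world = {"red block": "table", "green block": "red block", "blue block": "table"}

def on_top_of(block):
    """Return whatever is stacked directly on `block`, or None."""
    for b, support in world.items():
        if support == block:
            return b
    return None

def put(block, target):
    """Move `block` onto `target`, first clearing anything stacked on it."""
    above = on_top_of(block)
    if above is not None:
        world[above] = "table"  # naive strategy: dump obstructions on the table
    world[block] = target

def interpret(command):
    """Handle commands of the form 'put the X on the Y'."""
    parts = command.lower().replace("put the ", "").split(" on the ")
    if len(parts) == 2:
        put(parts[0] + " block", parts[1] + " block")
        print("OK.")
    else:
        print("I don't understand.")

interpret("put the blue on the green")
print(world)  # {'red block': 'table', 'green block': 'red block', 'blue block': 'green block'}
```

Whether maintaining and acting on a world model like this counts as "thinking" is, of course, exactly what the thread is arguing about.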
PixyMisa said:I haven't seen a precise definition of this usage of "think", but yes, I'd say that SHRDLU thinks.
I came across a piece by Terry Winograd, SHRDLU's creator, who seems less than optimistic...
Terry Winograd said: Artificial intelligence researchers predict that _thinking machines_ will take over our mental work, just as their mechanical predecessors were intended to eliminate physical drudgery. Critics have argued with equal fervor that _thinking machine_ is a contradiction in terms. Computers, with their foundations of cold logic, can never be creative or insightful or possess real judgment. Although my own understanding developed through active participation in artificial intelligence research, I have now come to recognize a larger grain of truth in the criticisms than in the enthusiastic predictions. The source of the difficulties will not be found in the details of silicon micro-circuits or of Boolean logic, but in a basic philosophy of patchwork rationalism that has guided the research. In this paper I review the guiding principles of artificial intelligence and argue that as now conceived it is limited to a very particular kind of intelligence: one that can usefully be likened to bureaucracy. Winograd (1991)
Nick
So, when you're discussing the neuronal level of processing, defining consciousness as self-awareness is meaningless. Agreed?
Belz, a couple of posts dialoguing with you and death loses its sting.
I'm sure PixyMisa would just point out how Winograd's assessment of his own creation is 'wrong' and 'irrelevant'!
I'm sure PixyMisa would just point out how Winograd's assessment of his own creation is 'wrong' and 'irrelevant'!
Not at all. It makes perfect sense, and is entirely congruent with what I am saying.
I came across a piece by Terry Winograd, SHRDLU's creator, who seems less than optimistic...
Sure. That was written twenty years after SHRDLU, and nearly twenty years have passed since then, and progress has been grindingly slow.
I don't know anywhere near enough about AI to make an assessment. But what I see is that people like Winograd and Hofstadter are actually quite critical of the more ambitious statements of AI's potential.
For example, in I Am a Strange Loop (great book), Hofstadter completely lays into an unnamed R&D head at Intel for his comment that Deep Blue could only process, but that Stanley thinks.
Both of these guys seem to enjoy bringing the more wild-eyed characters of the AI scene back down to earth.
So, who knows?
Nick
The real issues are elsewhere - language, learning, and perception being three of the major fields of research since then.
Neither are they computers. How interesting.
And a bacterium would still behave differently from a mush of random biochemicals. How interesting.
How interesting. This invalidates the notion of computing how, exactly?
So does the definition of a crystal, you genius.
Did you miss the part of my definition where I said "effectively means according to an arbitrary threshold"?
There is no hard threshold for a crystal; some crystals are much more ordered than others. The threshold for calling something a crystal is the point at which the system displays a non-linear difference in behavior, and it is different for every system. You don't know this?
I didn't define that threshold for computing, because I don't need to. All I have to say is that it exists -- and it does. You can put it anywhere you want. You can say a bowl of soup computes if you put the threshold very low, just as I could call liquid water crystalline if I put the threshold very low. What you cannot say is that a bowl of soup computes as well as a computer, because my definition prevents it -- just as you cannot say that the structure of liquid water is as ordered as that of ice. At some point there is a non-linear difference in a system's ability to categorize input: it begins to compute. At some point there is a non-linear change in the order of a system: it crystallizes.
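A toy numerical sketch of that point (the `order()` function and all the numbers are invented for illustration, not real physics): because the underlying transition is steeply non-linear, almost any placement of the arbitrary threshold sorts the cases the same way.

```python
# Sketch: an arbitrary threshold laid over a non-linear transition.
import math

def order(temperature):
    """Toy order parameter: rises steeply (non-linearly) as temperature drops."""
    return 1 / (1 + math.exp(temperature - 273.0))  # steep step near 273 K

THRESHOLD = 0.5  # arbitrary: put it wherever you like

for t in (250.0, 272.0, 273.0, 274.0, 300.0):
    state = "crystal" if order(t) > THRESHOLD else "liquid"
    print(f"{t:5.1f} K  order={order(t):.3f}  -> {state}")
```

Slide THRESHOLD anywhere between roughly 0.1 and 0.9 and the classification of 250 K versus 300 K never changes; the steepness of the transition is what makes the arbitrary cutoff harmless.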
When you connect electronic components properly, there is a huge non-linear difference in the system behavior.
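As a toy illustration of that jump (my own example, not anything from the thread): a single NAND gate just maps two bits to one, but wiring five of them together yields a half-adder that does binary arithmetic, something no individual gate does.

```python
# Sketch: isolated components vs. a connected system.

def nand(a, b):
    """One isolated component: a single NAND gate."""
    return 1 - (a & b)

def half_adder(a, b):
    """Five NAND gates wired together: adds two bits."""
    n1 = nand(a, b)
    s = nand(nand(a, n1), nand(b, n1))  # XOR built from NANDs -> sum bit
    c = nand(n1, n1)                    # AND built from NANDs -> carry bit
    return s, c

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))  # (sum, carry)
```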
You keep talking about the fact that isolated components can compute -- so what? Isolated portions of a droplet can crystallize. Who cares? Do you even know what a "system" is, westprog? Are you familiar with the notion of a "system"?
Thanks RD! I think I actually learned something from this post! I hadn't been entirely sure what was being argued about for about 20 pages now, but I think I've got your position now.
...snip...
The more you dig into any definition, the less sense it will make.
Thanks RD! I think I actually learned something from this post! I hadn't been entirely sure what was being argued about for about 20 pages now, but I think I've got your position now.
Everything comes in levels or degrees, and a useful definition accounts for those degrees. A definition of running that only allowed something to be called 'running' if the organism was bipedal and moving at precisely 10 mph would be useless: we would have to come up with another term for every other creature, every other speed, and every combination thereof. There are degrees of running.
A definition of a computer mouse that only applied to specific designs would also be useless. People would be saying things like 'This doesn't have buttons on the sides, so it's not a mouse, it's something else' or 'This one doesn't have a wheel, so it's not a mouse.' There are degrees of mouse-ness.
In the same way, a definition of consciousness that only applies to a specific level of complexity causes confusion. Is a mentally disabled person conscious? A dog? A beetle? Should we come up with new words to describe the behavior of these creatures?
Any definition of ANYTHING can be taken down the way people are trying to take down your definition of consciousness. If I define a computer mouse as 'that which controls the pointer on a computer monitor', then someone could easily crap on that definition by saying 'Well, what about the pad on my laptop, is that a mouse?' or 'What if I unplug it, does it stop being a computer mouse?' And the more you adjust for things like this, the more cumbersome and useless the definition becomes.
This is because there is no hard line between what is a computer mouse and what is not, or what is conscious and what is not. And as soon as you draw that line, there are suddenly all these things that are somewhat conscious or somewhat mouse-like that are outside of the line. What do you call these things? Do you make up another term and draw another line?
Computer mice and consciousness are concepts. Ideas. They are words used to describe how objects behave. The reason we have words, language, is to communicate these ideas to each other.
Your computer mouse is still a computer mouse, even though it is not like my computer mouse. We have other words for describing those differentiating features.
A rabbit is conscious, even though it is not conscious like a human is. We have other words for those differentiating features.
Trying to whittle down to the true core or essence of consciousness is just as pointless as trying to whittle down to the true core or essence of being a computer mouse.
The more you dig into any definition, the less sense it will make.