
Are You Conscious?

Are you conscious?

  • Of course, what a stupid question
    Votes: 89 (61.8%)
  • Maybe
    Votes: 40 (27.8%)
  • No
    Votes: 15 (10.4%)

Total voters: 144

I know not how to defend something that you, Belz, Robin, and dozens of others who must have voted No in the poll lack; you find 'subjective self' as meaningless as 'Flibbety Flobbety'.

I suspect 'stream of conscious' is equally meaningless to you?

That is funny. This is the JREF: you used the term, so the burden is on you. Great posturing, however.

The question still remains: define your term.

Otherwise we are still left with the claim of 'inner child' to deal with.

ETA: Your assumptions are just spin. I voted in the Maybe category, as the definition was not presented. Under the medical definition, yes, but there are a couple of times I wonder about.
 
OK then. You asked for it.

Stream of Consciousness. Kerouacian. The ongoing data dump of thoughts that come and, unbidden, go.

Stream of Conscious. Thoughts that result in public behavior.

Subjective self. The whatever-it-is that sometimes decides to move a thought from the Stream of Consciousness to the objective, 3rd party observable, Stream of Conscious. I'd presume these decisions involve application of the 'frame' that for AI is the Frame Problem.

I suggest that nearly all public behavior is the body running on autopilot, so to speak, does not require subjective-self review, and for the most part does not or even cannot become a thought available for such review.

And Pixy will of course tell us he's written programs that do just that. And I'd reply, perhaps, but where is the subjective self, and where is the stream of consciousness in the program?
 
It is such a weird discussion because AlBell either can't or won't see the flaw in what he is saying.

1. He introduces term X
2. He says he cannot define term X
3. He says that I must either define the term X or deny that I have X.

The third line is a self-contradiction. He is saying that if I don't know what X means then I must deny that I have X.

But how can I deny that I have X until I know what it means?
 
Robin, you must be working on definitions for consciousness and awareness of sufficient clarity that science can analyze them fully, so we will have no more threads of dozens of pages discussing them.
 
OK then. You asked for it.

Stream of Consciousness. Kerouacian. The ongoing data dump of thoughts that come and, unbidden, go.

I'm pretty sure that Robin didn't ask for word salad, but I could check his receipt.

Stream of Conscious. Thoughts that result in public behavior.

What about thoughts that DON'T result in public behaviour ?

Subjective self. The whatever-it-is that sometimes decides to move a thought from the Stream of Consciousness to the objective, 3rd party observable, Stream of Conscious. I'd presume these decisions involve application of the 'frame' that for AI is the Frame Problem.

The what that sometimes decides to move a thought from the what to huh ?
 
OK then. You asked for it.

Stream of Consciousness. Kerouacian. The ongoing data dump of thoughts that come and, unbidden, go.
Behavioural definition
Stream of Conscious. Thoughts that result in public behavior.
Behavioural definition
Subjective self. The whatever-it-is that sometimes decides to move a thought from the Stream of Consciousness to the objective, 3rd party observable, Stream of Conscious. I'd presume these decisions involve application of the 'frame' that for AI is the Frame Problem.
Stipulative definition in the general case and ostensive in a particular case.
I suggest that nearly all public behavior is the body running on autopilot, so to speak, does not require subjective-self review, and for the most part does not or even cannot become a thought available for such review.

And Pixy will of course tell us he's written programs that do just that. And I'd reply, perhaps, but where is the subjective self, and where is the stream of consciousness in the program?
Well clearly it all depends upon what "thought" means and what "decide" means.

Also, if "stream of consciousness" is a data dump - where is it being dumped?

If we consider a "thought" as data (as you have) and for consistency also use "decide" in an algorithmic sense * then clearly there is no problem - the computer together with the program will satisfy your definition of "subjective self" easily.

And the process of information manipulation happening in the computer will satisfy your definition of "stream of consciousness".

I don't know if Pixy would claim to have solved the frame problem but would you regard a solution to the frame problem as proof that a computer was conscious?


* And before anybody comments, I think "decide" only has an algorithmic sense, but acknowledge that many people would say that it has another sense.
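
To make the algorithmic reading concrete, here is a minimal sketch. It is purely illustrative, written in Python; every function name and every 'thought' in it is invented for the example, it is nobody's actual program, and it does not touch the frame problem. Read behaviourally, though, it has an ongoing, unbidden 'stream of consciousness', a 'subjective self' that algorithmically 'decides', and a public 'stream of conscious':

import random

def stream_of_consciousness():
    """Ongoing, unbidden 'thoughts' (here just data) that come and go."""
    notions = ["coffee", "that email", "rain on the window", "lunch"]
    while True:
        yield random.choice(notions)

def subjective_self(thought, frame):
    """The 'whatever-it-is' that decides whether a thought goes public.
    Here the 'frame' is nothing grander than a relevance filter."""
    return thought in frame

def run(steps=10):
    frame = {"that email", "lunch"}             # what currently 'matters'
    stream = stream_of_consciousness()
    for _ in range(steps):
        thought = next(stream)                  # private stream of consciousness
        if subjective_self(thought, frame):     # algorithmic 'decision'
            print("Public behaviour: mention", thought)   # stream of conscious
        # otherwise the thought simply dissipates, un-acted-on

if __name__ == "__main__":
    run()

Which is the point: on a purely behavioural and algorithmic reading, the three definitions are satisfied trivially, and nothing in the program looks like the thing people say they are puzzled about.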
 
Robin said:
OK then. You asked for it.

Stream of Consciousness. Kerouacian. The ongoing data dump of thoughts that come and, unbidden, go.
Behavioural definition
Subjective and private, perhaps.

Stream of Conscious. Thoughts that result in public behavior.
Behavioural definition
Subjective self. The whatever-it-is that sometimes decides to move a thought from the Stream of Consciousness to the objective, 3rd party observable, Stream of Conscious. I'd presume these decisions involve application of the 'frame' that for AI is the Frame Problem.
Stipulative definition in the general case and ostensive in a particular case.
And we appear to be making progress.


I suggest that nearly all public behavior is the body running on autopilot, so to speak, does not require subjective-self review, and for the most part does not or even cannot become a thought available for such review.

And Pixy will of course tell us he's written programs that do just that. And I'd reply, perhaps, but where is the subjective self, and where is the stream of consciousness in the program?
Well clearly it all depends upon what "thought" means and what "decide" means.
Yes, it does. Apparently we can struggle on, perhaps even move forward, even with that uncertainty.

Also, if "stream of consciousness" is a data dump - where is it being dumped?
Back where it came from, maybe. Or it just dissipates into nothingness. Perhaps you have a better way to express it.

As an even further aside, does the process I described match what you find in your own subjective view of 'what-is-happening'?

If we consider a "thought" as data (as you have) and for consistency also use "decide" in an algorithmic sense * then clearly there is no problem - the computer together with the program will satisfy your definition of "subjective self" easily.
Thought is a pattern of neurons firing, apparently; ergo, data. Why they fire in a way that provides the thought-by-thought stream is another problem. My subjective self doesn't seem to be involved, other than as a monitor, "deciding" to react, or not.

And the process of information manipulation happening in the computer will satisfy your definition of "stream of consciousness".
Perhaps. Why a programmer would decide to program in a way that what appear to be random thoughts (data) are presented for handling, or to be ignored, is another problem.

I don't know if Pixy would claim to have solved the frame problem but would you regard a solution to the frame problem as proof that a computer was conscious?
I don't foresee any solution to the frame problem if examined as a software challenge.

* And before anybody comments, I think "decide" only has an algorithmic sense, but acknowledge that many people would say that it has another sense.
As we now discuss how many angels dance on the head of libertarian free-will.

I do appreciate your thoughtful response. :)
 
OK then. You asked for it.

Stream of Consciousness. Kerouacian. The ongoing data dump of thoughts that come and, unbidden, go.
Okay, so a recording of thoughts.
Stream of Conscious. Thoughts that result in public behavior.
More thoughts.
Subjective self. The whatever-it-is that sometimes decides to move a thought from the Stream of Consciousness to the objective, 3rd party observable, Stream of Conscious. I'd presume these decisions involve application of the 'frame' that for AI is the Frame Problem.
Muddle, assumption of volition. Vague terminology.
I suggest that nearly all public behavior is the body running on autopilot, so to speak, does not require subjective-self review, and for the most part does not or even cannot become a thought available for such review.

And Pixy will of course tell us he's written programs that do just that. And I'd reply, perhaps, but where is the subjective self, and where is the stream of consciousness in the program?


See here you are, using vague terms and pretending they mean something.
 
Robin, you must be working on definitions for consciousness and awareness of sufficient clarity that science can analyze them fully, so we will have no more threads of dozens of pages discussing them.

Shift the burden much? You used a vague term and then say we have to define it for you.

There are many pages (maybe even books) written on the medical use of the term 'consciousness'.

You, on the other hand, are suggesting something you can't define, explain, or even present so far. Kind of makes it hard to discuss, now doesn't it?
 
As we now discuss how many angels dance on the head of libertarian free-will.
Which is pretty much what I am talking about - imprecise and/or circular definitions lead, after a very few steps, to angels on heads of pins.

Mathematics can avoid this by making its definitions pedantically precise.

Science can avoid this by testing its conclusions against empirical evidence.

But any other kind of language game (as UE would say) will eventually end up counting infinitesimal cherubim.
 
But any other kind of language game (as UE would say) will eventually end up counting infinitesimal cherubim.
Agreed.

Yet, I'll ask again if you would address, using the imprecision of language since that's what is actually available, my question from #1948.

"As an even further aside, does the process I described match what you find in your own subjective view of 'what-is-happening'?"

Even if not, thanks for your previous comments.
 
Which is pretty much what I am talking about - imprecise and/or circular definitions lead, after a very few steps, to angels on heads of pins.

Mathematics can avoid this by making its definitions pedantically precise.

Science can avoid this by testing its conclusions against empirical evidence.

But any other kind of language game (as UE would say) will eventually end up counting infinitesimal cherubim.

And should we refuse to use language where it is not mathematical or scientific? No, because that would mean an inability to communicate. Because we do communicate, even where our definitions collapse.
 
Agreed.

Yet, I'll ask again if you would address, using the imprecision of language since that's what is actually available, my question from #1948.

"As an even further aside, does the process I described match what you find in your own subjective view of 'what-is-happening'?"

Even if not, thanks for your previous comments.
The problem is that it doesn't seem a sufficient description - it doesn't seem to capture that which needs to be explained about consciousness.

And that is the problem I have been referring to since near the beginning of this thread.

Many of us feel that consciousness needs an explanation beyond mere function. But nobody can put into words just what it is that needs explaining.

This could mean that our puzzlement is spurious, or maybe it means we have not yet managed to frame the problem properly.

But I sense that many people are more interested in there being a mystery than they are in clarifying the problem (if any) or finding answers. (Chalmers for example).
 
And should we refuse to use language where it is not mathematical or scientific? No, because that would mean an inability to communicate. Because we do communicate, even where our definitions collapse.
No, but on the other hand we shouldn't pretend that the problem doesn't exist.
 
The problem is that it doesn't seem a sufficient description - it doesn't seem to capture that which needs to be explained about consciousness.
Sorry. I certainly didn't intend my admitted aside as explaining consciousness; rather a description of what seems to be going on in my mind.

And that is the problem I have been referring to since near the beginning of this thread.

Many of us feel that consciousness needs an explanation beyond mere function. But nobody can put into words just what it is that needs explaining.
Hence my 'subjective self' and other comments as a feeble attempt. It's strictly a flight-of-fancy.

This could mean that our puzzlement is spurious, or maybe it means we have not yet managed to frame the problem properly.
Still, words are all we have available.

But I sense that many people are more interested in there being a mystery than they are in clarifying the problem (if any) or finding answers. (Chalmers for example).
Your hoped-for mathematical clarity doesn't exist, and none of us know if it will ever exist, mystery or not.
 
Sorry. I certainly didn't intend my admitted aside as explaining consciousness; rather a description of what seems to be going on in my mind.
I didn't think you intended it to explain consciousness.

But what I am saying is that it does not even identify what it is that needs to be explained.

And that is not a criticism, because nobody who feels that there is something beyond function that needs to be explained - including me - can put into words what exactly needs to be explained.
 
Sorry. I certainly didn't intend my admitted aside as explaining consciousness; rather a description of what seems to be going on in my mind.


Hence my 'subjective self' and other comments as a feeble attempt. It's strictly a flight-of-fancy.


Still, words are all we have available.


Your hoped-for mathematical clarity doesn't exist, and none of us know if it will ever exist, mystery or not.


See, this is all very cool: you are describing, as opposed to just using the term and then making assumptions based upon those assertions.

My issue is always the loose way the term consciousness gets used.
 
My claim is that the problem is probably fundamental and not resolvable by some new form of words.
Then why are we talking about it at all?

If we can discuss it then we can at least examine the words we use and check if they have a meaning.

I think you are doing the same thing that AlBell is doing - you use the word "understand" and then assume that you know what it refers to and assume that everybody shares the associations you have with this word.

You reject my definition because you think it does not cover what you think you mean by the term, and yet you cannot put into words what you mean by the term.

So in the end I mean something by the word and you mean something quite different. You know what I mean by the term but I have no way of knowing what you mean by it.

So we end up talking past each other. One resolution would be that you could accept that you don't know what you mean by the word "understand".
 
