Has consciousness been fully explained?

Uh huh, nice retort. And as stated before, many things are conflated under consciousness, and by you now under SOFIA. You have not defined it, you have not explored it, you have not tried to delineate it. Yet you insist that it exists. SOFIA is not a prerequisite for hallucination; perception is. So you have just conflated perception into SOFIA.

Just as a whale is a fish.

I'm sorry, but I can't parse any of that.
 
I think what David is saying is that you haven't defined "sofia".

When I say Elanor caught a frog last night and you say "Who's Elanor?" and I point to a 14 lb. black cat on the back of the couch, you now know who Elanor is.

Are you really going to claim that you have no idea what I'm talking about when I mention a "sense of felt individual awareness" which stops when you fall asleep, starts again when you dream, stops again when the dream's over, and starts again when you wake up?
 
Are you really going to claim that you have no idea what I'm talking about when I mention a "sense of felt individual awareness" which stops when you fall asleep, starts again when you dream, stops again when the dream's over, and starts again when you wake up?
I am going to claim that, yes. At best I have a vague feeling that I might have something like that, but until you define it in a scientifically meaningful way I can't really be sure.

I can't parse that either. I'm sorry but I have no idea what you're trying to say.
Looks like you can't make sense of Sofia either.
 
I can't parse that either. I'm sorry but I have no idea what you're trying to say.

You have defined SOFIA so far to be a high-level abstraction. I am saying that perceptions, which are, as it were, models created from the sensations, are the basis of hallucinations: a person who is experiencing hallucinations is generating perceptions that are spurious, or false to the sensory input. (Hallucinations are also distinct from dreaming, which is a similar but different process.)

So are you saying that SOFIA is needed to create hallucinations?

In your statement
"Yes, we can have hallucinations, but we cannot hallucinate the experience of Sofia itself, because it is a prerequisite to the experience of any hallucination," by what mechanism are you suggesting that SOFIA is a prerequisite for hallucination?

I am saying that perception is a prerequisite (and in fact the likely mechanical basis of hallucination); you are saying SOFIA. I am asking for an explanation of how that might speculatively work.

So I understand your statement that we cannot hallucinate SOFIA itself (I will not drag in delusions at this point), but I am asking: how is SOFIA a prerequisite for hallucination?
 
But we could be a model in a very large bottle somewhere in some larger-dimensional universe. It would be all the same to us.

Oh, that is what I think of when I see the word "simulation."

I wasn't aware of a special distinction you were making between a model and a simulation. Sorry!
 
Similarly, no matter how faithfully we simulate the brain, we'll never make any of those signature waves move through OPR, for example, because the thing we call the simulation is an abstraction. No actual brain is operating in OPR, which is where the phenomenon of consciousness actually occurs, just as the phenomenon of spilling oil only actually occurs in OPR, even though they are different kinds of phenomena.

Your definitions aren't complete, though.

You can only say that the OPR spilling of OPR oil only occurs in OPR.

That is rather obvious. So what? We can still say that oil spills -- it just isn't in OPR. Why is there a fixation with OPR?

I want you to tell me why, for instance, the navigation mechanics used by the A.I. I work with are any less "real" than the navigation mechanics used by a robot in OPR. Aren't they the same algorithms, operating on the same data? What difference does it make if the data comes from another module that interprets camera images (robot) or from raw data structures stored as part of a simulated world (my A.I.)? In the end, it is the same kind of decisions being made -- find the shortest path between two points of data in a data structure that models the world the entity inhabits.

And is that not what we do? Do our brains not also operate on data from the model of the outside world we hold in our heads?
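
To make that concrete, here is a minimal Python sketch (my own illustration, with hypothetical grid data -- not code from any actual robot or simulation): the same breadth-first shortest-path routine runs unchanged whether its occupancy grid was built by a vision module interpreting camera images or read straight out of a simulated world's data structures.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid (0 = free, 1 = blocked).

    The routine is indifferent to where `grid` came from: a vision module
    interpreting camera images, or a simulated world's raw data structures.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, [start])])
    visited = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in visited):
                visited.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None  # no route exists

# Identical call either way; the data's origin never enters the algorithm.
grid_from_camera = [[0, 0, 1],
                    [1, 0, 1],
                    [1, 0, 0]]  # hypothetical output of a vision module
grid_from_sim = [[0, 0, 1],
                 [1, 0, 1],
                 [1, 0, 0]]    # hypothetical simulated-world data
assert shortest_path(grid_from_camera, (0, 0), (2, 2)) == \
       shortest_path(grid_from_sim, (0, 0), (2, 2))
```

Nothing inside the function can even tell which kind of world produced its input; that is the sense in which the two sets of navigation mechanics are the same.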
 
I am going to claim that, yes. At best I have a vague feeling that I might have something like that, but until you define it in a scientifically meaningful way I can't really be sure.

Then we have nothing to talk about.
 
You have defined SOFIA so far to be a high-level abstraction.

Absolutely not. I've said very clearly that it's a function of the body.

I mean, you might want to claim that "blood pressure" is an abstraction, but the only thing that's an abstraction about it is the number we give it -- the pressure is perfectly real.

So is Sofia.
 
So are you saying that SOFIA is needed to create hallucinations?

If you want to put it that way, yes, it's needed if you're going to have a hallucination, because Sofia has to be running, so to speak, for there to be any kind of hallucination, as I understand the word.

But it doesn't create hallucinations.
 
In your statement
"Yes, we can have hallucinations, but we cannot hallucinate the experience of Sofia itself, because it is a prerequisite to the experience of any hallucination," by what mechanism are you suggesting that SOFIA is a prerequisite for hallucination?

I am saying that perception is a prerequisite (and in fact the likely mechanical basis of hallucination); you are saying SOFIA. I am asking for an explanation of how that might speculatively work.

So I understand your statement that we cannot hallucinate SOFIA itself (I will not drag in delusions at this point), but I am asking: how is SOFIA a prerequisite for hallucination?

To take the last question first, if you have no experiential sense of you and of some sort of external reality (actual or imagined), then no, you can't possibly be "experiencing" anything, by definition.

You gotta be experiencing something to experience a hallucination.

And if you're aware of experiencing something, that is by definition a Sofia event.

When you say "perception is a prerequisite", it's hard to say that's not true, but I'm not certain exactly what you mean.

There can be more than one prereq for things, of course.

Anyway, we don't have to know the mechanism -- which is fortunate, because no one does yet -- that generates Sofia to know that there can't be any hallucinations without one.

The brain can be mistaken about things without being consciously aware of them, but it can't hallucinate anything without some form of Sofia.
 
Your definitions aren't complete, though.

You can only say that the OPR spilling of OPR oil only occurs in OPR.

That is rather obvious. So what? We can still say that oil spills -- it just isn't in OPR. Why is there a fixation with OPR?

You can't be seriously asking this question.
 
I want you to tell me why, for instance, the navigation mechanics used by the A.I. I work with are any less "real" than the navigation mechanics used by a robot in OPR. Aren't they the same algorithms, operating on the same data? What difference does it make if the data comes from another module that interprets camera images (robot) or from raw data structures stored as part of a simulated world (my A.I.)? In the end, it is the same kind of decisions being made -- find the shortest path between two points of data in a data structure that models the world the entity inhabits.

And is that not what we do? Do our brains not also operate on data from the model of the outside world we hold in our heads?

No, they don't. Our brains don't hold any data or any models. They are habit machines, specially designed association machines.

There's as much data in our brains as there is in our muscles.

The algorithms for the navigational mechanics, well, those are "real" as long as there's someone alive to think about them.

But if you think any sort of simulated anything is as real as things you bang up against in reality... I'm not sure what to say about that.
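
For what it's worth, here's a toy Python sketch of the distinction I'm drawing (my own illustration with made-up pairings, not a claim about actual neurons): in the first structure a fact is a stored, retrievable record; in the second, nothing is stored as a record at all, and "recall" is just whichever association repeated pairings have strengthened most.

```python
# Explicit model: facts stored as retrievable records.
lookup = {"frog": "pond", "cat": "couch"}
print(lookup["frog"])  # "pond" -- the fact is literally stored and fetched

# Association machine: no record is stored anywhere; repeated pairings
# strengthen connection weights, and "recall" is a reconstruction.
weights = {}

def associate(cue, response, strength=1.0):
    """Strengthen the habit linking cue to response (hypothetical toy rule)."""
    weights[(cue, response)] = weights.get((cue, response), 0.0) + strength

def recall(cue):
    """Return whichever association with the cue is currently strongest."""
    candidates = {b: w for (a, b), w in weights.items() if a == cue}
    return max(candidates, key=candidates.get) if candidates else None

associate("frog", "pond")
associate("frog", "pond")
associate("frog", "tree")
print(recall("frog"))  # "pond" -- not looked up, but won by habit strength
```

In the second machine there is no "pond" fact anywhere to point to, only weights; that's the sense in which I say there's as much data in our brains as in our muscles.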
 
If you want to put it that way, yes, it's needed if you're going to have a hallucination, because Sofia has to be running, so to speak, for there to be any kind of hallucination, as I understand the word.

But it doesn't create hallucinations.

I am trying to understand your use of SOFIA: for there to be perceptions, does there have to be SOFIA, or are they parallel?

So is it like a process underlying perceptions or a side by side process?
 
To take the last question first, if you have no experiential sense of you and of some sort of external reality (actual or imagined), then no, you can't possibly be "experiencing" anything, by definition.

You gotta be experiencing something to experience a hallucination.

And if you're aware of experiencing something, that is by definition a Sofia event.

When you say "perception is a prerequisite", it's hard to say that's not true, but I'm not certain exactly what you mean.

There can be more than one prereq for things, of course.

Anyway, we don't have to know the mechanism -- which is fortunate, because no one does yet -- that generates Sofia to know that there can't be any hallucinations without one.

The brain can be mistaken about things without being consciously aware of them, but it can't hallucinate anything without some form of Sofia.

Fair enough, sounds like a parallel process then.
 