
Has consciousness been fully explained?

Status
Not open for further replies.
Oh, I understand fine. I'm just wondering if Malerin does. Or you, for that matter.

And at least your inane ad hominem was appropriate to the topic for once. That's progress, of a sort.

Yes, I'm beginning to see the light, PixyMisa. I'm sure that with enough effort our paltry little minds will one day reach your level of keen intellect and understanding :)
 
Right, how silly of me. I forgot such queries were beyond your capacity to answer.
I should answer for Malerin now?

Quite honestly, I don't know how Malerin defines anything relating to the mind, partly because nothing he says on the subject seems correlated with reality, and partly because he steadfastly refuses to define anything whatsoever.

Or you, for that matter.

That's interesting. Tell me more.
 
What is that supposed to mean?


And after a great deal of coaxing, FUWF conceded that this meant that he considered my definition overbroad, and offered for consideration some qualifiers, which was a valid and interesting discussion, though he still resorted too often to ad hominems in place of reason. Then he left.


Which is, of course, neither correct nor in any way relevant.


I didn't ignore it, I pointed out that it was irrelevant.


Well, no.


Nope. You still don't grasp what self-reference is.


Reductio ad fail, I'm afraid.




How is that memorable? I don't recall exactly how FUWF redefined "qualia", but his misunderstanding of Dennett's position - and mine - is complete.


Yes, as I said, he all too often resorted to ad hominem attacks (and arguments from authority) in place of reason.

Which is, of course, entirely unrelated to anything I have ever said.

Right. Now this goes to the heart of the discussion with FUWF. He says that consciousness is self-referential information processing, but with additional qualifiers. In other words, we mostly agreed, and he entirely disagrees with you, and AkuManiMani, and the rest of you.

He had a semantic difference. He decided, rather than discuss this difference reasonably, to devote much of his time to irrelevant and absurd ad hominems.


Again, did not address anything I ever said.

Again, he doesn't bother to address the point, just resorts to ad hominems. The conclusion that there are multiple distinct consciousnesses running simultaneously within your brain is a very clear one given the examination of brain function and failure modes. Blindsight is one good example; split brain patients the most striking.

His key failure here is that he doesn't even attempt to address the point. He did that a lot, which is why the debate mostly went nowhere.


Again, this is not addressing anything I've ever said.


And took him off when he decided to actually discuss matters. And then he left.


He made a few rational points. You have quoted only one of them - the rest being utter failure - and even in that he couldn't restrain himself to discussing the subject.

Oh, and just remember, through all that abuse and illogic, he was agreeing with me.

He specifically did NOT say that.

I want to clarify things a bit, because my response above sounds like I agree with Pixi, who says consciousness can be defined as an SRIP. While I agree with Pixi's conclusion, I still disagree that we can use this as a definition. As I said before, the definition I would be comfortable with is that consciousness utilizes a form of SRIP.

While I believe that SRIPs are necessary for consciousness, they may not be sufficient.


And later....

Oh my. There is the over-broad equivalence and qualification error you're making again. It is NOT correct. It would be correct to say "consciousness is a form of computation", however. Big difference in qualification, though - see it yet?

Here's Pixi's definition again:

"Consciousness is SRIP"

Colloquially, though I might argue it's still sloppy, I'm fine with it. Philosophically, it is untenable due to at least two problems, which I will explain.

Problems:

1. Too broad >>> Qualification error. There are many forms of SRIPs, including some we probably don't even know about yet (perhaps including the specific form that may truly give rise to consciousness). With a definition like this, you have to logically conclude that consciousness can and must encompass all of them. There is no evidence to support this, so it cannot be a definition. It is still mere conjecture.
2. Equivalence error. The word "is" sets a VERY high bar of equivalence: SRIPs must be both necessary and sufficient for consciousness. Pixi makes great arguments that they are necessary, but he has never made one that they are sufficient. In fact, he essentially admits that in several posts. Furthermore, many of my explanations for the nature of consciousness, which Pixi appears to agree with, propose necessary features that may employ, but may not require, SRIPs - or for which SRIPs are not the true essence (since SRIPs encompass the most basic forms of recursion, they can be too trivial to claim as the essence for equivalence). All I have to do is show that there is some computational requirement for consciousness that is not a form of SRIP, OR does not have to be programmed using an SRIP - e.g., it could turn out our brain uses nothing but various forms of SRIPs, but that we (or evolution on some other planet) could create an equivalent AI consciousness that is not completely dependent on SRIPs.
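The necessary-versus-sufficient distinction being argued over in these two points can be stated compactly in logical notation (introduced here for clarity; neither poster uses it):

```latex
% "Consciousness IS SRIP" asserts an equivalence, i.e. necessary AND sufficient:
\forall x \,\bigl(\mathrm{Conscious}(x) \leftrightarrow \mathrm{SRIP}(x)\bigr)

% "SRIP is necessary for consciousness" asserts only one direction:
\forall x \,\bigl(\mathrm{Conscious}(x) \rightarrow \mathrm{SRIP}(x)\bigr)
```

A single counterexample in either direction (a conscious system without SRIP, or an SRIP system that is not conscious) refutes the equivalence but leaves the one-directional claim untouched.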




I agree with him. SRIP is a necessary condition for a conscious system. The problem you have is you go a step further and assert SRIP IS consciousness, meaning it's necessary AND sufficient. And of course that's what gets you into all sorts of trouble with conscious toasters and cars and whatnot.

There was one response from you that I thought spoke volumes:

You already accept my definition as a necessary component. You say it's not sufficient. Fine. What is?

And that would open up whole avenues of approach to figuring out what consciousness is. If only you'd taken that additional step...
 
...
I agree with him. SRIP is a necessary condition for a conscious system.
...
There was one response from you that I thought spoke volumes:
Originally Posted by PixyMisa:
You already accept my definition as a necessary component. You say it's not sufficient. Fine. What is?
And that would open up whole avenues of approach to figuring out what consciousness is. If only you'd taken that additional step...

So you both agree that SRIP is necessary for consciousness, but you (like me) feel it isn't sufficient. So why not address that question, what else is required?

It seems to me that we are clarifying what our definitions of consciousness encompass. Pixy has a minimal definition that is broader in scope than most of us are comfortable with, and has asked what more we think is required to satisfy us. For my part, I don't yet know what else is required to produce what I would be comfortable calling consciousness, but I suspect it involves a computational architecture that allows a level of general-purpose creative adaptive behaviour (as seen in many biological organisms with a CNS). Whether insects are conscious by my preferred definition, I don't really know, but I doubt it; similarly for insect hive/swarm behaviour (and does SRIP apply here anyway?). My predilection may simply originate in the feeling that a conscious entity ought to behave with less robotic predictability than shown by insects, but when I see the complexity and flexibility of behaviour that an ant swarm can display, it gives me pause.

It does seem that going beyond SRIP involves drawing an ill-defined and arbitrary personal line between what one feels is and what one feels is not consciousness - unless one can posit further well-defined functional requirements that will satisfy one's own preferred definition. I can't yet, but I think further study of biological brains will make it clearer, and perhaps I eventually will be able to specify what else is needed for my preferred (and at present arbitrary) definition of consciousness.

Can you posit any well-defined functional requirements, in addition to SRIP, that might satisfy your own preferred definition of consciousness?
 
I agree with him. SRIP is a necessary condition for a conscious system.

Erm, you probably don't agree with him like you think you do.

That's like an IDer saying they agree with Dawkins because both accept evolution is a factor in our history.

FUWF's position was that consciousness is a form of SRIP that has a whole bunch of other stuff augmenting it, none of which is magical. In other words, that the difference between an SRIP earthworm and an SRIP human is quantitative, not qualitative.

Is that also your position? I doubt it.
 
Some memorable quotes from FedUp:

What you fail to realize is that FUWF simply didn't like Pixy's distinction between general consciousness and human consciousness. For FUWF, consciousness == human consciousness.

That is all their arguments were about -- notice how there isn't a single disagreement between them regarding functionality or science. It is only about semantics.

In fact FUWF made it clear that his opposition to Pixy's arguments about qualia not existing was only that "qualia" is still a useful term even though qualia aren't what most non-computationalists think they are.

In this case, the enemy of your enemy is not your friend.
 
Erm, you probably don't agree with him like you think you do.

That's like an IDer saying they agree with Dawkins because both accept evolution is a factor in our history.

FUWF's position was that consciousness is a form of SRIP that has a whole bunch of other stuff augmenting it, none of which is magical. In other words, that the difference between an SRIP earthworm and an SRIP human is quantitative, not qualitative.
Right. And I actually had a worthwhile discussion with FUWF about that. As I recall, he posited three additional functions he thought were required for a system to be called conscious. I didn't entirely agree, but I did agree that his points were worth taking further. Unfortunately, he left the forum shortly after that.
 
My point is that the biological functioning of neurons is an integral part of their ability to generate consciousness and, subsequently, conscious behavior. I also pointed out earlier that the thermodynamic properties of living organisms seem to be a base requirement of supporting conscious activity. Since we do not understand the biophysics of how these cells accomplish this it’s both presumptuous and premature to make determinations as to what "emergent functionalities" are necessary. Attempts at creating artificial conscious systems that ignore these issues are doomed to failure.

Your mistake is to think *EVERY* single process in that cell is related to consciousness. Actually, many processes are common to most cells in the body, or even across eukaryotes. You haven't explained why, for example, the DNA repair mechanism would be necessary for consciousness - it is, after all, present in all eukaryotic cells as far as I can tell - or why the protein translation process would be necessary. To give another example, the cell repair and protein-making processes are not specific to liver cells (as far as I remember) and are not necessary to explain a liver cell's function. What is important is to know what additional functions those cells perform (and they are indeed very complex and varied functions). The same goes for, say, pancreas or muscle cells. Why would it be different for a neuron?

I also pointed out earlier that the thermodynamic properties of living organisms seem to be a base requirement of supporting conscious activity.

I am sorry, but I would like to see evidence of that. *ALL* we have to date is that we know of only one type of "conscious" being - animals - and we already know that damage to the neural network alters that animal's consciousness. From that we CANNOT conclude this is the only way to go. So, speaking of "presumptuous": back at you.
 
So you both agree that SRIP is necessary for consciousness, but you (like me) feel it isn't sufficient. So why not address that question, what else is required?

It seems to me that we are clarifying what our definitions of consciousness encompass. Pixy has a minimal definition that is broader in scope than most of us are comfortable with, and has asked what more we think is required to satisfy us.
It has an estimable historical foundation though - cogito ergo sum is clearly talking about self-referential information processing. :)

For my part, I don't yet know what else is required to produce what I would be comfortable calling consciousness, but I suspect it involves a computational architecture that allows a level of general purpose creative adaptive behaviour (as seen in many biological organisms with a CNS).
Yep.

The reason I settled on the single minimal definition is that (a) it's widely accepted as necessary, if not sufficient, and (b) when you start adding requirements you tend to run face first into this list:
http://en.wikipedia.org/wiki/Category:Neurological_disorders

And find that there are actual living breathing people who act like they are conscious in every way except what you were just proposing to add. Things like aboulia, ideational apraxia, or anterograde amnesia. (Never mind the truly weird and disturbing stuff like Capgras or Cotard delusion.)
 
If you read this Pixy, you may believe your meaning is clear when you repetitively and baldly post something like "Conscious is Self-Referential Information Processing" but in my opinion it is not. To me (at least for the first few readings) it sounded like some kind of strident conclusion without any real argument to back it up, but now I see that you are simply repeating the definition you have chosen to use (X years ago?). Do you have a pointer to where you first gave (and perhaps argued for?) this definition?

Newcomers to this topic might have the impression that Pixy presented a lot of complex arguments in support of the SRIP hypothesis, and that he's not bothering to produce them now. That isn't the case - he's been quite consistent throughout, and the same arguments and justifications have been put forward at all times.
 
You're a waste of resources, but I've never put you on ignore (or RD or any of you SRIP fruitcakes). It was cowardice.

I've never understood this. I have just about all of the Web on ignore, plus the vast, vast majority of all the books ever written, and nearly every person who's ever lived. That's a necessary condition of finite resources. If I happen to spend some time on a particular forum dealing with a particular subject, I'm certainly going to streamline it as much as I can. Since Pixy keeps saying the same things over and over, there's very little point in rereading them again and again. It wouldn't be worth it even if he were right.
 
I've never understood this. I have just about all of the Web on ignore, plus the vast, vast majority of all the books ever written, and nearly every person who's ever lived. That's a necessary condition of finite resources. If I happen to spend some time on a particular forum dealing with a particular subject, I'm certainly going to streamline it as much as I can. Since Pixy keeps saying the same things over and over, there's very little point in rereading them again and again. It wouldn't be worth it even if he were right.
Sure. Someone says something utterly nonsensical, I tell them that they are wrong and why. If they say the same nonsensical thing again, I tell them they are still wrong.

You don't need to keep reading unless you're the one saying the nonsensical things.
 
I wouldn't classify it as an hypothesis, but as an axiom.
Well, the problem is we have one type of well-defined behaviour - self-referential information processing - and another type of behaviour - consciousness - which is not nearly so well-defined.

If you start to clearly define what specific observable behaviours you categorise as consciousness, then self-referential information processing becomes a hypothesis. After all, you can't construct a hypothesis to explain something you haven't defined.
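To make concrete what "self-referential information processing" means as a well-defined behaviour, here is a minimal illustrative sketch (my own toy construction, not Pixy's or anyone else's actual definition): a processor whose output depends not only on external input but also on a model of its own internal state, which it updates as it runs.

```python
# Minimal sketch of a self-referential information processor:
# the system's own state is one of the inputs to its processing.

class SelfReferentialProcessor:
    def __init__(self):
        # A model of the system's own history and state.
        self.self_model = {"steps": 0, "last_input": None}

    def step(self, external_input):
        # Output is a function of the external input AND a snapshot
        # of the system's current model of itself.
        output = (external_input, dict(self.self_model))
        # The system then updates its self-model to reflect what it just did.
        self.self_model["steps"] += 1
        self.self_model["last_input"] = external_input
        return output

p = SelfReferentialProcessor()
first = p.step("a")   # self-model still records zero prior steps
second = p.step("b")  # self-model now records the previous step
```

The point of the sketch is only that this behaviour is mechanically specifiable and observable, which is what makes one side of the proposed hypothesis well-defined; whether such a loop has anything to do with consciousness is exactly what the thread is disputing.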
 
Newcomers to this topic might have the impression that Pixy presented a lot of complex arguments in support of the SRIP hypothesis
Why? It's not complex at all, if you understand the groundwork. If you don't understand the groundwork, then you need to read Gödel, Escher, Bach.

and that he's not bothering to produce them now. That isn't the case - he's been quite consistent throughout, and the same arguments and justifications have been put forward at all times.
Well, yes, because I'm still waiting for a coherent counter-argument. See how patient I am?
 
Ichneumonwasp said:
What I'm saying is that what makes subjective experience subjective is that reflective activity. There's certainly more to the experience part.


Ah, OK, yes.
LOL. More to the experience part!

That's the part most here mean by consciousness - the part that the "SRIP! Church-Turing! Read GEB!" crowd, Pixy and RD, ignore and/or pretend doesn't exist.
 
I'd like to know what the thermodynamic properties of living organisms are while he's at it.

I already explained.

A living system has a higher chance of persisting into the future than non-living systems, all else being equal, due to the behaviors it exhibits as a result of computations.

In other words, it does stuff to keep itself going, not the least of which is the controlled use of energy to maintain local order.
 
If you start to clearly define what specific observable behaviours you categorise as consciousness, then self-referential information processing becomes a hypothesis. After all, you can't construct a hypothesis to explain something you haven't defined.

But that process of "defining" seems pretty arbitrary to me. I could define something else, and we'd argue until eternity about which one is better. Like Euclidean vs non-Euclidean geometry.

That doesn't mean I don't see the merit of your definition, but I don't see how it leads to any objectively testable hypothesis.
 
My point is that the biological functioning of neurons is an integral part of their ability to generate consciousness and, subsequently, conscious behavior. I also pointed out earlier that the thermodynamic properties of living organisms seem to be a base requirement of supporting conscious activity. Since we do not understand the biophysics of how these cells accomplish this it’s both presumptuous and premature to make determinations as to what "emergent functionalities" are necessary. Attempts at creating artificial conscious systems that ignore these issues are doomed to failure.

Your mistake is to think *EVERY* single process in that cell is related to consciousness. Actually, many processes are common to most cells in the body, or even across eukaryotes. You haven't explained why, for example, the DNA repair mechanism would be necessary for consciousness - it is, after all, present in all eukaryotic cells as far as I can tell - or why the protein translation process would be necessary. To give another example, the cell repair and protein-making processes are not specific to liver cells (as far as I remember) and are not necessary to explain a liver cell's function. What is important is to know what additional functions those cells perform (and they are indeed very complex and varied functions). The same goes for, say, pancreas or muscle cells. Why would it be different for a neuron?

Because we don't understand exactly what biophysical feature of neurons [or the body as a whole, for that matter] allows them to collectively produce and support conscious experience. Without this understanding we cannot design a relevant model, let alone a synthetic reproduction.

I also pointed out earlier that the thermodynamic properties of living organisms seem to be a base requirement of supporting conscious activity.

I am sorry, but I would like to see evidence of that. *ALL* we have to date is that we know of only one type of "conscious" being - animals - and we already know that damage to the neural network alters that animal's consciousness. From that we CANNOT conclude this is the only way to go. So, speaking of "presumptuous": back at you.

I never said that it is "the only way to go". I'm just pointing out the obvious fact that, until we gain a sufficient understanding of the case examples of consciousness that we already know of [i.e. living animals], proposing to model and/or reproduce it is premature. Personally, I think it's probably possible to create artificial systems that support consciousness, but the fact remains that we currently lack the scientific understanding to accomplish this.
 