But presumably the C' that corresponds to C = pleasure is what causes me to prolong the activity, because there is some evolved reason to prolong the activity other than its pleasure-producing quality. The activity must be "pleasureful" in a purely zombie-like manner.
What we have to try to picture are zombie-reasons why we do everything we do, with the phenomenal experience merely an added quality. Of course, if what Edelman means by "informational" is that a memory of the phenomenal experience can be formed, then that changes the whole thing.
I wish I could find his email address.
~~ Paul
Yes. Sorry; I shouldn't be derailing this into a debate on the plausibility of epiphenomenalism, when your OP was only asking what Edelman, who sounds like some sort of epiphenomenalist, meant by a certain phrase. (I am baffled as to how information, the painful or pleasant qualia that inform us of our mood, could be acausal; unless by "thermodynamically acausal" he means that it all reduces to the physics of neurons, which is where causation occurs, with consciousness being just another way of modelling it that needn't reference thermodynamics [albeit the underlying physical changes it maps onto must be consistent with thermodynamics].)
The hard problem of consciousness is getting a useful definition of the word out of people. It's like pulling teeth. From the wishy-washy sorts of definitions dualists tend to give, I don't believe that I have it. Using more trivial definitions of consciousness, such as processing that deals with the self, there's no controversy that people are conscious; but then the hard problem of consciousness degenerates into "why is it necessary to have processing that concerns the self in order to have processing that concerns the self?".
To echo Paul (post #23) -- brain functioning [consciousness]: really complicated.
I don't see why processing inputs "straightaway as data" would be different from "experiencing data".
I didn't phrase that, or think it through, very well. I mean processing the data strictly as computation: numerically to logically to mechanically and back up again. For people, it seems data can be processed unconsciously (e.g., searching one's memory, performing habitual tasks, monitoring of vital systems, in dreamless sleep; roughly equivalent to "computational -- mechanical"); or consciously, experienced as integrated sense input ("I feel cold" / "bored" / "hungry" / ...), as 'negative' pain vs 'positive' pleasure, etc.
There. I think that makes sense, though with consciousness it's hard to be sure.
You're supposing that the equivalent processing could more easily be done without "experience" (whatever this word means).
Some segment or duration of consciousness.
I don't believe so; I believe people are p-zombies.
A less clumsy way of saying this is perhaps: if a p-zombie version of me sees the colour red, it will not only insist up and down that it saw the colour red, it will believe that it experienced the colour red (because otherwise it would report that it did not experience redness, and so would not be behaving exactly like me). In other words, a p-zombie can't tell that it is a p-zombie and not a "real person"; why, then, do you believe you are a "real person" and not just a p-zombie that thinks and acts as if it experiences, when all it is doing is merely processing data?
Well, to say we are p-zombies begs the question: is there such a thing as consciousness and experience? If there isn't, then eliminative materialism is true, and we are p-zombies. If there is, then either dualism is true, or "p-zombies" are incoherent.
I believe that "consciousness" marks a meaningful distinction between many states, the most obvious being asleep versus awake, so I believe I am sometimes conscious and have experiences. As consciousness's dependence on the brain and body is adequate evidence against dualism, I conclude that p-zombies can't exist and that I am something different: a "real person".