Explain consciousness to the layman.

Status
Not open for further replies.
Don't we start with a non-materialist theory of mind? It's practically how we define "mind," as in such constructs as "mind over matter."

The junk that goes on in my mind seems so much in excess of what's required for survival that I do wonder if the excess needs an explanation. In fact per some reported experiences people function better while "unconscious." By that I mean, the narrator disappears, an athlete's "in the zone," the archer and the arrow become one ...

Is that computational? It's not the perfect computing that throws me, it's all the irrelevant junk that seems to interfere with the perfect computing.
 
If people become emotionally attached to their Roomba cleaning robots, there shouldn't be a problem with robot AIs.

Roombas are darling, just precious, although mine has an owie on his foot, poor little guy.
 
Not really - not only would you not be exact duplicates for very long after duplication, but during that period it would be reasonable to suppose you would both have apparent continuity of experience and therefore an identical and rational fear of death, just as the original did. Also, when you each see the other and come to rationalize your situation, your apparent continuity of experience means you would initially still both feel you were the original - and certainly, at the point one of you realizes he really is/isn't the original, you are no longer exact duplicates... :)

I entirely agree, but there are some people entirely wedded to the idea of a spare rendering them redundant.
 
That seems to be a strawman of your creation.

I suggest a perusal of the relevant threads. I was accused of being entirely irrational for not regarding the existence of another self at the other end of the transporter as being good reason to allow myself to be vapourised.
 
Westprog is quite correct - that viewpoint has been put forward on more than one occasion, and people who disagreed have been labelled as irrational. And while our reasons may be different, I agree with his position on this.
 
I suggest a perusal of the relevant threads. I was accused of being entirely irrational for not regarding the existence of another self at the other end of the transporter as being good reason to allow myself to be vapourised.

You are correct. I misread your post.
 
Westprog is quite correct - that viewpoint has been put forward on more than one occasion, and people who disagreed have been labelled as irrational. And while our reasons may be different, I agree with his position on this.

Quite right. I read more into his post than was in it. :o
 
My statement that consciousness is unexplained is in conflict with the people who claim that there is an explanation, and that it is SRIP - computation - and that there is no need to consider anything else. It is not in conflict with people who think that this is a possible or even probable explanation.

No, that's not what I meant. You said:

The claim that consciousness is computational is undoubtedly stronger than my claim that consciousness is currently unexplained.

Knowing that it's computational doesn't explain it fully. Ergo the two are not mutually exclusive.
 
Two very interesting research articles on subjective experience.

I think Paul C. Anagnostopoulos posted them up in the past.

http://philsci-archive.pitt.edu/4888/1/Two_Conceptions_of_Subjective_Experience.pdf

http://web.gc.cuny.edu/cogsci/private/Churchland-chimeric-colors.pdf

I printed out the article on chimeric colors and, sure enough, can produce them. The point of bringing it up here is to show how the brain creates qualia separate from the physical world. We CAN see colors which cannot be produced with wavelengths of light. The belief that we do is central to the impression that the mind is not computational. We imagine that there's somehow a green color in the brain when we look at green grass, but the fact is, there is not. The green quale is a complete fantasy our brain creates via data processing. There's no magical greenness happening in the mind, and it's incidental that we've learned to correlate it with green objects and green light. Subjective experiences are the internal experiences of the information processing of the brain. We look at a ripe strawberry and think there's some ripe strawberriness happening in our mind, but there isn't -- just numbers in registers (Dennett).

One summer at noon when the sun was blazing down on a garden, a lunch mate set me up over a bush of bright magenta flowers. I stared into the flowers until they turned white, after about a minute, then looked at the green grass. Wow! The grass was an extreme chimeric super-green. It reminded me of being on acid when I was a teen. Neurons overshooting because of exhaustion in the first example, or a drug in the second. No super-green super-chlorophyll in the mind. Just unusually big numbers in some of the registers that specify the color.

When we listen to music, there's no music in the brain. When we feel coldness there's no part of the brain that gets cold. We INTUIT that but it's an illusion. When you finally GET this you understand how the mind is, indeed, computational.
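The overshoot story above can be sketched as a toy model (my own illustration, with invented numbers; nothing from the articles): an opponent red-green channel adapts to a steady stimulus, and when the gaze shifts, the rebound pushes the channel's value past its normal range, i.e. "unusually big numbers in the registers."

```python
# Toy opponent-process sketch (hypothetical values, for illustration only).
# A red-green channel reads +1.0 for pure red/magenta and -1.0 for pure green.
# The reported value is the raw signal minus the current adaptation level.

def reported(raw_signal, adaptation):
    """What 'reaches the registers' after adaptation is subtracted."""
    return raw_signal - adaptation

# Stare at magenta flowers: adaptation drifts up toward the stimulus,
# so the flowers fade toward white (reported value near zero).
adaptation = 0.8
print(reported(0.8, adaptation))   # 0.0: the flowers look washed out

# Now glance at green grass (raw signal -0.5): the adapted channel
# overshoots past -1.0, a "super-green" no wavelength alone produces.
print(reported(-0.5, adaptation))  # about -1.3, outside the normal -1..1 range
```

The point of the sketch is only that an out-of-gamut percept needs no out-of-gamut stimulus; plain subtraction on adapted values is enough.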
 
What's a "living" computer?

Alive in the biological sense: we have an example of life to refer to on this planet.

My suggestion is that "consciousness" is a function of being alive in this sense, and an algorithmic simulation of it can only mimic it.
 
Alive in the biological sense: we have an example of life to refer to on this planet.

My suggestion is that "consciousness" is a function of being alive in this sense, and an algorithmic simulation of it can only mimic it.
How can you tell the difference? Why do you think there even is a difference?
 
I printed out the article on chimeric colors and, sure enough, can produce them. The point of bringing it up here is to show how the brain creates qualia separate from the physical world. We CAN see colors which cannot be produced with wavelengths of light. The belief that we do is central to the impression that the mind is not computational. We imagine that there's somehow a green color in the brain when we look at green grass, but the fact is, there is not. The green quale is a complete fantasy our brain creates via data processing. There's no magical greenness happening in the mind, and it's incidental that we've learned to correlate it with green objects and green light. Subjective experiences are the internal experiences of the information processing of the brain. We look at a ripe strawberry and think there's some ripe strawberriness happening in our mind, but there isn't -- just numbers in registers (Dennett).

One summer at noon when the sun was blazing down on a garden, a lunch mate set me up over a bush of bright magenta flowers. I stared into the flowers until they turned white, after about a minute, then looked at the green grass. Wow! The grass was an extreme chimeric super-green. It reminded me of being on acid when I was a teen. Neurons overshooting because of exhaustion in the first example, or a drug in the second. No super-green super-chlorophyll in the mind. Just unusually big numbers in some of the registers that specify the color.

When we listen to music, there's no music in the brain. When we feel coldness there's no part of the brain that gets cold. We INTUIT that but it's an illusion. When you finally GET this you understand how the mind is, indeed, computational.

This seemed perfectly reasonable, apart from the non sequitur in the last sentence. The fact that the brain doesn't purely produce an exact representation of sensory input doesn't imply that the process it uses is computational, either in the formal (Turing machine) sense or in an informal one (works sort of like a digital computer).
 
This seemed perfectly reasonable, apart from the non sequitur in the last sentence. The fact that the brain doesn't purely produce an exact representation of sensory input doesn't imply that the process it uses is computational, either in the formal (Turing machine) sense or in an informal one (works sort of like a digital computer).


I wish you understood something about many of the words that you keep using.
 
I wish you understood something about many of the words that you keep using.

I had a peek at this post because I wondered "Does Complexity have something to contribute to this?" Glad to see some things don't change.
 
I had a peek at this post because I wondered "Does Complexity have something to contribute to this?" Glad to see some things don't change.


You don't seem to understand very much about science, including computer science, yet that doesn't discourage you from sharing your opinions with us on these and other matters.

I really wish you'd find something better to do.
 
The fact that the brain doesn't purely produce an exact representation of sensory input doesn't imply that the process it uses is computational, either in the formal (Turing machine) sense, or an informal way (works sort of like a digital computer).
The brain probably does not literally operate like a digital computer or a Turing machine.

But, whatever processes the brain is using can be emulated on a digital computer or a Turing machine (or a universal computing machine of any sort).

In theory: if we can ever duplicate the process exactly, it wouldn't be different from natural consciousness. It would be an actual consciousness, artificially induced.

If that theory turns out to be wrong (which is unlikely), then at least we will still learn a lot about how consciousness does work in the process of trying to emulate it.
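The emulation claim above can be made concrete with a minimal sketch: a few lines of Python stepping through a Turing machine's rule table. The machine itself (a bit-flipper) and its rules are my own toy example, not anything from the thread; the point is only that a digital computer can carry out any such rule table mechanically.

```python
# Minimal Turing machine simulator: a digital computer emulating a TM
# step by step. The rule table maps (state, symbol) to (write, move, next).

def run_tm(rules, tape, state="start", halt="halt", max_steps=1000):
    tape = dict(enumerate(tape))  # sparse tape; blank cells read as "_"
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Toy machine: flip every bit, halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(rules, "1011"))  # prints 0100_
```

Nothing here depends on the machine being simple; a richer rule table runs on the same loop, which is the sense in which "whatever processes the brain is using" could in principle be emulated, if they are computational at all.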
 