
Has consciousness been fully explained?

Presumably physicalists tend towards supervenient epiphenomena for thought just as idealists tend towards supervenient epiphenomena for energy/matter.

Thought at least seems to have a quality we could name 'intent', something the physical, as usually defined, lacks.


There are certainly many ways to pretend one has The Answer, or gloss over the fact one doesn't.

There are certainly many ways to pretend one has The Answer, or gloss over the fact one doesn't. One of these is to throw in terms like "supervenient epiphenomena".
 
Yes we do.

You might not, but we do.

All this is, is a combination of the argument from personal incredulity and the argument from ignorance. You don't personally believe that we know something, therefore it's magic. Sorry, but logical fallacies don't cancel each other out that way.

But you're taking away the special! Meanie Materialist wanting to take all the mystery and colour out of life! Love!! Beauty!! Warm Soft Things!! None of these mean anything if we're just chemicals!!!:rolleyes:
 
Wasp, sequel to your detailed reply to Clive on pain complexity (post #3977), just focussing on "pain", do you think it's possible to state necessary and/or sufficient conditions for consciousness of pain, in terms of neurons? Is there a minimum level of electrochemical activity before a pattern of neurons firing is conscious for the subject? Does it ever make sense to speak of unconscious "pain" (neurons firing along dedicated pain pathways which the subject isn't conscious of; or perhaps anesthetized pain, where the enzymes that cause the pain neurons to fire are inhibited by drugs)? Beyond just pain, are there certain subsystems of the nervous system which must be involved for the subject to be conscious of the neurons firing (i.e., is activity in certain regions always unconscious; always conscious)?


Yes, I think we can speak about unconscious pain, and I think it is revealing. You may have been hit with something so painful -- like putting your hand on a burning stove -- that you removed your hand reflexively. And then you felt the pain afterwards.

I think it makes sense to speak of unconscious pain in this situation. There is no motivation to act, to do something; instead there is stimulus response.

As to what minimum amount of processing is necessary, I have no idea.

Which systems are necessary for consciousness? I think the only thing we can say with confidence is that the cingulate is necessary for the suffering aspects -- but there is clearly more to pain than simply that part of the picture.


Maybe a pointless sidetrack, but I'm curious how much "consciousness" can be pinned down to certain regions of the brain / nervous system; certain thresholds of nervous activity; perhaps even certain patterns of enzyme catalysis or neurons firing? It's well known of course that certain frequencies of neurons firing as measured by EEGs are indicative of certain types of consciousness -- (Beta [12-30 Hz] and active concentration, for example) -- and certain regions more active than others in these states: can we be any more specific (layman deferring to your expertise; you'd hinted at an answer in your earlier reply re the five pain pathways)?


Not that I'm aware of. I would think that certain areas have to be involved but there are so many that it would amount to a run down of most of the cortex and subcortex.
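The EEG band figures mentioned above can be made concrete with a toy classifier. A minimal sketch, assuming conventional band edges (which vary slightly by source) and a synthetic 20 Hz signal rather than real EEG data:

```python
import math

# Conventional EEG frequency bands in Hz; the beta range matches the
# 12-30 Hz figure mentioned above.  Edges are illustrative, not canonical.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 100)}

def band_of(freq_hz):
    """Return the name of the EEG band a frequency falls into."""
    for name, (lo, hi) in BANDS.items():
        if lo <= freq_hz < hi:
            return name
    return None

def dominant_frequency(samples, rate_hz):
    """Naive DFT peak-pick: the frequency bin with the largest magnitude."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):   # skip DC, stay below Nyquist
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * rate_hz / n

# Synthetic one-second "recording": a pure 20 Hz oscillation sampled at 256 Hz.
rate = 256
signal = [math.sin(2 * math.pi * 20 * t / rate) for t in range(rate)]
freq = dominant_frequency(signal, rate)
print(freq, band_of(freq))   # prints: 20.0 beta
```

Real EEG analysis would of course use a windowed FFT over band *power*, not a single peak, but the classification step is the same idea.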

I was fascinated by your discussion of pain asymbolia, where patients can feel pain yet not suffer, not be repelled or motivated by feeling it. Naively, I would have expected the pain 'quale' was the motivation for acting to alleviate the pain. Yet it appears the consciousness of the pain, the 'quale', can be separated from the motivation to act on it. Any complementary cases you know of, where patients react to 'painful' stimuli that they don't 'feel' ("painful" here would translate to what would cause pain absent the condition: not sure what to call it... "pain p-zombie"?)?

Whew, that's a lot of questions... sorry. :blush: (& a slightly premature happy new year!) ETA: whoops... looks like you've answered most of my last question in a subsequent reply to Orbini (post #3978). *sigh* just ignore me... :dig:



Happy New Year to you too.

Yes, unfortunately there does not appear to be any simple answer with pain. The motivation to act is the suffering aspect but that just doesn't cover the issue of feeling pain intensity, which is certainly part of the conscious experience.

I was hoping that any discussion of the suffering might move things along, since that is a huge part of what we mean by the phrase "what pain is like". Since the suffering aspect of it appears to be something different from what we might expect -- it seems to be a motivation to act rather than just some vague sensory notion -- I hoped we might use that to view all 'feelings' in a way that can bridge this supposedly unbridgeable gap between neuron firings and qualia. I'm not sure that is going to work, though, reading some of the other discussions.
 
But you're taking away the special! Meanie Materialist wanting to take all the mystery and colour out of life! Love!! Beauty!! Warm Soft Things!! None of these mean anything if we're just chemicals!!!:rolleyes:
Taking away the special gives my life meaning.
 
I think it makes sense to speak of unconscious pain in this situation. There is no motivation to act, to do something; instead there is stimulus response.

As to what minimum amount of processing is necessary, I have no idea.

I'd say that it depends greatly on the desired complexity of the response. In a simple stimulus response, hardly any processing is necessary.

In more complex cases, the organism may have several options on how to deal with the pain, and the best option depends on the circumstances. If you have a sore tooth, pulling it out may be the best option, even if it temporarily increases the pain. That kind of decision involves long term planning and insight, which take a lot of processing.

So, there's no definitive answer. The absolute minimum is zero (no response at all), and it can go up as far as you want.
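The point about response complexity can be sketched with a toy contrast: a one-comparison reflex versus a small option-weighing planner. All names and numbers here are made up for illustration.

```python
def reflex(pain_level, threshold=7):
    """Pure stimulus-response: withdraw if pain exceeds a fixed threshold.
    Essentially zero processing -- one comparison."""
    return "withdraw" if pain_level > threshold else "ignore"

def plan(options):
    """Pick the option with the best long-term outcome, even when it
    temporarily increases the pain (the sore-tooth case above)."""
    # net utility = long-term relief minus the immediate cost
    return max(options, key=lambda o: o["relief"] - o["immediate_cost"])["name"]

print(reflex(9))   # prints: withdraw

toothache_options = [
    {"name": "do nothing",     "immediate_cost": 0, "relief": 1},
    {"name": "painkillers",    "immediate_cost": 1, "relief": 4},
    {"name": "pull the tooth", "immediate_cost": 6, "relief": 10},
]
print(plan(toothache_options))   # prints: pull the tooth
```

The reflex needs no model of the world at all; the planner needs at least a list of options with predicted consequences, and real long-term planning needs vastly more -- which is the spectrum the post describes.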
 
See Pixy's comments above. But then your statement:

"A computational simulation of a generator will not produce electricity, and a structural replica of a radio will not pick up stations."

has nothing to do with whether we can, in fact, simulate a working brain. You went on to say:

"We need to grasp the physics of what living brains are doing with regard to consciousness in order to know how to create artificial systems that have the same capabilities."

which is certainly true. But it may turn out there is no special physics and a conventional computer simulation will do the trick just fine.
~~ Paul

That hypothesis rests solely upon the assumption that consciousness is computation. As I've already said, understanding the computational architecture of the brain can only give us information as to how our conscious experiences are organized; it does nothing in the way of understanding what it is in physical terms. Such an understanding is absolutely necessary in order to gain the technical knowledge required to seriously propose how to instantiate it artificially and to know what systems/substrates would be necessary & sufficient to meet those requirements.
 
That hypothesis rests solely upon the assumption that consciousness is computation. As I've already said, understanding the computational architecture of the brain can only give us information as to how our conscious experiences are organized; it does nothing in the way of understanding what it is in physical terms. Such an understanding is absolutely necessary in order to gain the technical knowledge required to seriously propose how to instantiate it artificially and to know what systems/substrates would be necessary & sufficient to meet those requirements.

Or you can just go ahead and try to instantiate it artificially, and see if it works. If it doesn't work, we may get a clue where to fix it. That approach is preferable over just thinking about it, and saying it can't be done.
 
Sticking with the radio example: If we have an accurate computer model of a radio -- sans the necessary physical materials assembled in proper fashion -- do you think that it will pick up radio stations?

No, but I never suggested anything like that. What's the obsession around here with bad analogies ?

I'm talking about a model, running on physical hardware, connected to the outside world through suitable I/O converters. I never suggested leaving out the physical materials, or the I/O converters, so please don't insist on doing so.

To stick with a better analogy: if we have an accurate model of a piano, down to every detail, and the model can calculate exactly how the string would vibrate, and how that vibration would be transferred to the surrounding air, and we run that model on a physical computer, and attach a speaker to our sound card, would it sound like a piano ?

Of course. We already understand the physical basis for the stimuli we interpret as sound: they are vibrations of a particular range of frequencies propagating through some medium to stimulate our auditory sensory organs.

We do not possess that same level of understanding with regard to consciousness, which translates those vibrations into subjective qualities like sound. And make no mistake, those vibrations are not 'sound' until some conscious entity perceives them as such. Can you explain to me how any given array of "suitable I/O converters" can be made to perceive vibrations as the sensation of sound? How about tuning it to perceive EM radiation as color? Better yet, just explain to me how it can be made to perceive any given stimuli as anything at all?

The simple fact of the matter is that we do not understand the physics of how physical brains produce such things.
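As an aside on the piano model discussed above: that kind of string simulation is routine in audio synthesis. A minimal sketch using the classic Karplus-Strong plucked-string algorithm (arbitrary parameters; a serious piano model is far more detailed):

```python
import random

random.seed(0)   # deterministic "pluck" for reproducibility

def karplus_strong(freq_hz, rate_hz=44100, duration_s=0.5, decay=0.996):
    """Karplus-Strong synthesis: a noise burst circulating through a
    delay line with averaging decays into a string-like tone."""
    period = int(rate_hz / freq_hz)                       # delay-line length
    buf = [random.uniform(-1, 1) for _ in range(period)]  # the "pluck"
    out = []
    for i in range(int(rate_hz * duration_s)):
        s = buf[i % period]
        # averaging adjacent samples low-passes the loop, modeling damping
        buf[i % period] = decay * 0.5 * (s + buf[(i + 1) % period])
        out.append(s)
    return out

# An A4 string; write the samples to a WAV file / sound card to hear it.
samples = karplus_strong(440.0)
print(len(samples))   # prints: 22050 (half a second at 44.1 kHz)
```

Fed through a speaker, this produces vibrations in the air exactly as the post describes -- which is precisely why the model side is the uncontroversial half of the argument.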
 
Can you explain to me how any given array of "suitable I/O converters" can be made to perceive vibrations as the sensation of sound?

A microphone.

How about tuning it to perceive EM radiation as color?

A camera.

The simple fact of the matter is that we do not understand the physics of how physical brains produce such things.

Sure we do. We know how the ear transforms sounds into small electrical pulses. We know how the retinal cells convert light into similar pulses. We can figure out how these propagate and how they are processed by carefully looking at the brain cells.

You keep coming back to "consciousness", but you forget that I'm not interested in that. I only want to solve the "easy problem", namely reconstructing a purely functional, behavioral model of the brain.

Sound goes into a microphone, it gets processed, and after a while, a voice comes from the speaker and says: "nice piano music, dude".
 
Or you can just go ahead and try to instantiate it artificially, and see if it works. If it doesn't work, we may get a clue where to fix it. That approach is preferable over just thinking about it, and saying it can't be done.

There's a world of difference between claiming that synthetic consciousness cannot be achieved and pointing out that we lack the requisite knowledge to achieve it.

TBPH, AI researchers have been trying for decades to produce consciousness via the computationalist route and have yet to produce anything even remotely resembling a p-zombie. I've already proposed reasonable physical criteria for determining whether or not a system meets what appears to be the necessary conditions of a conscious system. I've also explained why a non-conscious system cannot self-generate a convincing repertoire of behaviors so don't construe what I'm saying as just poo-pooing negativism. There is substance to what I'm arguing and you know it.
 
TBPH, AI researchers have been trying for decades to produce consciousness via the computationalist route and have yet to produce anything even remotely resembling a p-zombie.

Agreed. They used the wrong methods.

I'm suggesting we start with a good model of a neuron, and then combine more and more, until we have a functional model of a brain. Ignore consciousness, and just look at the physics. Why would that go wrong ?
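For what it's worth, the "start with a good model of a neuron" program usually begins with something like a leaky integrate-and-fire unit. A minimal sketch, with illustrative rather than physiological parameters:

```python
class LIFNeuron:
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    toward rest and emits a spike when it crosses threshold.
    All parameters are illustrative, not fitted to any real cell."""
    def __init__(self, tau=10.0, v_rest=0.0, v_thresh=1.0):
        self.tau, self.v_rest, self.v_thresh = tau, v_rest, v_thresh
        self.v = v_rest

    def step(self, input_current, dt=1.0):
        """Advance one time step; return True if the neuron spikes."""
        # dv/dt = (v_rest - v)/tau + I, integrated with forward Euler
        self.v += dt * ((self.v_rest - self.v) / self.tau + input_current)
        if self.v >= self.v_thresh:
            self.v = self.v_rest   # reset after the spike
            return True
        return False

# Drive one neuron with a constant current and count the spikes.
neuron = LIFNeuron()
spikes = sum(neuron.step(0.15) for _ in range(100))
print(spikes)   # prints: 9 (a regular spike train from constant drive)
```

Combining many of these with weighted connections is exactly the "then combine more and more" step -- the open question in the thread is only what such a network does or doesn't add up to.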
 
Can you explain to me how any given array of "suitable I/O converters" can be made to perceive vibrations as the sensation of sound?

A microphone.

What makes a microphone perceive vibrations as sound while a human corpse does not? An ear is essentially a microphone but all a microphone does is propagate the vibration pattern; it does not produce sound.

How about tuning it to perceive EM radiation as color?

A camera.

By that logic simply opening a dead man's eye will produce the perception of color. An eye is a camera but EM radiation does not become color until someone conscious perceives it as such.

The simple fact of the matter is that we do not understand the physics of how physical brains produce such things.

Sure we do. We know how the ear transforms sounds into small electrical pulses. We know how the retinal cells convert light into similar pulses. We can figure out how these propagate and how they are processed by carefully looking at the brain cells.

There are literally trillions of different impulses being propagated not only by the nervous system but across the membranes of numerous other tissue types. As of now we do not understand the physics of what makes some of these impulses become perceptions of particular sensations, or even how such things as sensations are related to physics in general.

You keep coming back to "consciousness", but you forget that I'm not interested in that.

So why are you participating in a discussion about consciousness?


I only want to solve the "easy problem", namely reconstructing a purely functional, behavioral model of the brain.

Sound goes into a microphone, it gets processed, and after a while, a voice comes from the speaker and says: "nice piano music, dude".

Without consciousness there is no perception of 'piano music' or any appreciation of it being 'nice'. What you're talking about is cobbling together a sophisticated toy that gives a range of canned responses to amuse English speaking humans who happen to be conscious themselves.
 
Agreed. They used the wrong methods.

I'm suggesting we start with a good model of a neuron, and then combine more and more, until we have a functional model of a brain. Ignore consciousness, and just look at the physics. Why would that go wrong ?

I'm saying consciousness itself is an overlooked aspect of what we understand to be physics.
 
What makes a microphone perceive vibrations as sound while a human corpse does not? An ear is essentially a microphone but all a microphone does is propagate the vibration pattern; it does not produce sound.

You were asking about I/O conversion. That's what a microphone is for. I was hoping it was obvious that the rest of the processing is done by the computer, and not by the microphone. Compare the microphone to an ear, and the brain to the rest of the computer. I didn't expect this to be that difficult.

So why are you participating in a discussion about consciousness?
Analyzing functional behavior is the first step. Some people call it the "easy" problem, and would like to skip it, but I think it's actually quite hard, so let's work on that first, and learn something valuable in the process.

What you're talking about is cobbling together a sophisticated toy that gives a range of canned responses to amuse English speaking humans who happen to be conscious themselves.

I'm talking about making a functional replica of the brain, neuron by neuron, on a purely physical basis. That has nothing to do with canned responses.
 
But you never explain why a physical replica of a neuron won't work just as well as the original.

Wait, you mean actually constructing biological neurons and assembling them into a brain? I gotta tell you I got some serious doubts about such a thing producing a 'functional replica' -- at least in the way you seem to conceive of it.

Why not simply put that hypothesis to the test and just try to reanimate a cadaver? :confused:
 
Wait, you mean actually constructing biological neurons and assembling them into a brain? I gotta tell you I got some serious doubts about such a thing producing a 'functional replica' -- at least in the way you seem to conceive of it.

I thought it was obvious from the last dozen posts that I was talking about a computer running a functional model of a single neuron. We mimic exactly the function, but using a different physical substrate. At the edges, where the neuron interfaces with the rest, we use a suitable I/O converter to translate the digital numbers in the computer to electrical impulses, compatible with a real neuron's connections.

If we could make the electronics small enough, we could in theory remove a neuron from a person's brain and replace it with a tiny microchip, restoring all the connections to neighboring neurons. Inside the chip, it would work completely differently, but looking from the outside, it would behave exactly the same.

Clear ?
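The neuron-for-microchip swap amounts to substituting implementations behind a fixed interface. A toy sketch, using a bare McCulloch-Pitts-style threshold unit as a stand-in for the "functional model" (everything here is illustrative):

```python
class BiologicalNeuron:
    """Stands in for the original cell: sums weighted inputs and fires
    above threshold (a bare-bones threshold-unit abstraction)."""
    def __init__(self, weights, threshold):
        self.weights, self.threshold = weights, threshold

    def fire(self, inputs):
        return sum(w * x for w, x in zip(self.weights, inputs)) >= self.threshold

class ReplacementChip:
    """Different internals (integer fixed-point arithmetic), same
    external behavior at the connections -- the 'tiny microchip'."""
    def __init__(self, weights, threshold, scale=1000):
        self.w = [round(w * scale) for w in weights]
        self.t = round(threshold * scale)

    def fire(self, inputs):
        return sum(w * x for w, x in zip(self.w, inputs)) >= self.t

weights, threshold = [0.5, 0.25, 0.25], 0.6
cell = BiologicalNeuron(weights, threshold)
chip = ReplacementChip(weights, threshold)

# From the outside the two are indistinguishable on every input pattern.
patterns = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
print(all(cell.fire(p) == chip.fire(p) for p in patterns))   # prints: True
```

The thought experiment in the post is exactly this check, applied neuron by neuron to a real brain; the thread's disagreement is over whether behavioral indistinguishability at the connections is all there is to preserve.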
 
There are certainly many ways to pretend one has The Answer, or gloss over the fact one doesn't. One of these is to throw in terms like "supervenient epiphenomena".
Unlike "consciousness" those words do have a bit of sharable meaning. ;)
 