
Are You Conscious?

Are you conscious?

  • Of course, what a stupid question

    Votes: 89 61.8%
  • Maybe

    Votes: 40 27.8%
  • No

    Votes: 15 10.4%

  • Total voters
    144
If my central point is that we are lacking the knowledge of the "specific physical thing" that is a sufficient indicator of consciousness why in blue barfing blazes would I then turn around and claim knowledge of what it is?
That is the point -- there is no reason at all to assume that consciousness is a specific physical thing.

The substitutes must be able to at least chemically mimic the "natural" signal molecules in order to produce similar effects.
Why? Yes, the receptor sites have to have the same functional effect they currently do, but beyond that constraint we could (in theory) replace it with (say) pushrods, direct electrical connections, optical interconnects, whatever -- as long as the appropriate receptors excite or inhibit the target neuron to the same degree they currently do, nothing would have changed.
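The functional-equivalence claim can be sketched in code. In this toy model (entirely hypothetical, not a model of real neurochemistry), a neuron only sees the net excitation delivered at its receptor interface; two different transmission mechanisms that deliver the same excitation produce identical spike trains:

```python
# Toy illustration of functional equivalence at the receptor interface.
# All names and numbers here are illustrative, not biological claims.
class ToyNeuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.potential = 0.0

    def receive(self, excitation):
        """Accumulate input; fire (and reset) when the threshold is crossed."""
        self.potential += excitation
        if self.potential >= self.threshold:
            self.potential = 0.0
            return True   # spike
        return False

def chemical_synapse(strength):
    # stands in for transmitter release plus receptor binding
    return strength

def electrical_synapse(strength):
    # stands in for a direct electrical coupling with the same gain
    return strength

inputs = [0.4, 0.3, 0.5, 0.2]
n1, n2 = ToyNeuron(), ToyNeuron()
spikes_chem = [n1.receive(chemical_synapse(s)) for s in inputs]
spikes_elec = [n2.receive(electrical_synapse(s)) for s in inputs]
# different "mechanisms", identical downstream spike trains
```

Since the neuron's behavior depends only on the excitation it receives, any mechanism with the same gain is, by construction, indistinguishable downstream.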

Whatever devices are used they must inhere the relevant physical properties that allow our biological neurons to produce consciousness.
Why? Why would mere functional equivalence not suffice?

Being as how we do not know the physical "whats" and "hows" of consciousness we have no way of knowing what hardware systems would be sufficient beyond our own. It's common flipping sense, dude.
If it was "common flipping sense", then we would not be having this conversation.

It's entirely unjustified to assume that merely emulating the computational functions of our neurons is sufficient to produce consciousness -- especially when we have not yet discovered what consciousness is or how it is produced in the first place.
I simply think that talking about consciousness as if it were a special property of the neurochemistry of individual neurons is missing the forest for the chloroplasts. I think that consciousness arises as a consequence of the overarching neural architecture of our brains, and that we should be able to replicate that on a substrate that is not based on our biochemistry.

Even assuming that we actually do learn what physically constitutes consciousness, simulating it would not reproduce it any more than a computer simulation of gravity produces actual gravitational effects.
Again, we do not know that consciousness has something that specifically constitutes it at a physical level. The only good definitions we have are essentially behavioral ones.

You CANNOT engineer a feature into a system without having a rudimentary understanding of it or, at the very least, the ability to physically identify it.
We do have at least a rudimentary understanding of consciousness as humans implement it. This intro to psych course provides a decent overview of how much we know as of late 2004. It does not get into philosophy, and I regard that as a Good Thing.

For the life of me, I don't get why you are so resistant to facing this fact.
Because it is not a fact, it is your personal opinion.

Right. So basically you're saying that the chemical properties of neurotransmitters, and the physical conditions of biological brains have absolutely no relevance to the production of sensations, emotions, or other subjective states.
Oh, they have relevance, but at the level of describing the mechanics of how brains work, not at the level of describing the abstract behaviors those mechanics implement.

You can't even address the basics of what qualia are, or how they are produced, yet you insist that creating them is a simple matter of "engineering".
Actually, I think that qualia is a useless philosophical term of art that has little relevance to what is actually happening in the brain.

Just who do you think you're kidding?
No-one. I am participating in this conversation primarily for my own amusement.

I never said they are the only possible ones.
Good, then you don't have a problem with replacing them with some other mechanism, or abstracting them away entirely.

The point is to find out WHAT consciousness is and HOW that "synaptic activity" produces it in the first place.
I think that describing consciousness in terms of synaptic activity would be an epic waste of time, for the same reason that describing Windows 7 in terms of circuit layouts would be.

Oh my god! Could you -please- spare me the constant appeals to "but we can simulate it!".
Not until you can explain why a sufficiently detailed simulation of a conscious system would not be conscious without resorting to your usual red herring objections. After all, a simulation of an information processing system still processes information.
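To make the point concrete: a gate-level simulation of an adder circuit is "only" a simulation of the physical hardware, yet it still genuinely performs addition. A minimal sketch (illustrative, not any particular real circuit):

```python
# A gate-level "simulation" of a ripple-carry adder. The simulation is
# an abstraction of a physical circuit, yet it still performs real
# addition -- unlike a simulation of gravity, which attracts nothing.
def full_adder(a, b, carry_in):
    s1 = a ^ b
    return s1 ^ carry_in, (a & b) | (s1 & carry_in)

def add_bits(x_bits, y_bits):
    """Add two equal-length little-endian bit lists."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out
```

Here `add_bits([1, 1], [1, 0])`, i.e. 3 + 1 in little-endian binary, yields `[0, 0, 1]`, i.e. 4: the simulated circuit computes exactly what the physical one would.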

First of all, we do not know what consciousness is to begin with, so claims of knowledge of how to simulate it are complete ****ing rubbish.
As are claims that it must have something special to do with the details of our neurochemistry.

Second of all, even if we did have the knowledge required to design such a simulation, simulation itself is, in principle, NOT a reproduction.
In principle, yes. In the real world when dealing with specialized information processing systems, not so much.

The reason I think simulation will be sufficient is because the primary output of conscious systems is behavior that other conscious systems recognize as conscious.

Your entire position basically boils down to: "Brains compute, therefore consciousness is, a priori, computation. All that is needed to produce consciousness is to emulate the computations of the brain and call it a day."
If by "All that" you mean "One way". I think that brute-force simulation at a neuron-by-neuron level will be useful primarily as a stepping stone, if we have to do it at all.


Yet, in the same breath, you'll handwave away the significance of those conditions or even the need to understand how they relate to consciousness.
No, I just do not assign the same significance that you do to those low-level phenomena.

Stop lying to yourself.
I am not.


It's a flat fact that we do not know what consciousness is or understand how the chemistry/physics of the brain produces subjective experience.
I do not think we will find the answers to subjective experience at the level of the chemistry or physics of the brain. I think we will find them at the level of the information-processing properties of the brain. Stop insisting that we live in a state of complete ignorance on the problem of subjective experience or that your approach is the only valid one.

Your claim that computation is a sufficient explanation of consciousness is not only a -belief-, it's a completely unjustified one at that.
And if that was exactly what I was claiming, your rancor might be justified. I claim that computation is necessary, just like chemistry and physics are. I claim that we have not found what the sufficient constraints on computation (or physics or chemistry) are.


So, by your criteria, how would one go about discerning if a nematode has subjective experiences and if so, what the range of its experiences are, what it will experience given a particular stimulus, and what it's experiencing during a given period of time?
My model cannot answer these questions, as it focuses on behaviour rather than trying to account for details of subjective experience.

If you cannot answer these questions you do not have a sufficient theory of consciousness, and all your handwaving bluster about computational criteria amounts to nothing more than a pile of empty platitudes.
We do not have enough information to formulate any sort of test for consciousness by your criteria, so why should I rely on any test created at our current level of knowledge that relies on anything but self reporting for subjective experiences, knowing that it can be fooled by a program that prints "I think therefore I am"? Behavioral tests are much more useful right now.

Yet you can't tell me the first thing about what those allegedly conscious systems are experiencing or how your "criteria" even relate to those experiences. Get real.
Yup -- I focus on behaviors not "experiences", subjective or otherwise.

In other words, when you were claiming that you knew the sufficient criteria for discerning consciousness you were just talking outta your behind.
Again, behavior, not subjective experience.

You've gotta be kidding me. You have a theory of consciousness that explains nothing, criteria for discerning consciousness that can't even tell us if a nematode has subjective experiences and you seriously consider your "model" to be the epistemic equivalent of an internal combustion engine? Are you completely daft?
Only on the Internet, apparently.
 
No, that's not the case. When engineers design real-time control systems, they do so to an abstract model. That's how all engineers and scientists work. It's just that their model includes a time concept, and the Turing machine model does not.

And that doesn't matter. Because the Turing machine model is just an abstraction. And it can deal with time as an abstraction in terms of order.
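As a sketch of that abstraction: in a minimal Turing-machine stepper, "time" is nothing more than the order of steps, the loop counter below. This illustrates the model under discussion, not a claim about how brains work:

```python
# Minimal Turing-machine stepper. "Time" in this model is purely the
# ordering of steps (the counter t); there is no wall clock anywhere.
def run_tm(rules, tape, state="start", max_steps=1000):
    """rules maps (state, symbol) -> (new_state, symbol_to_write, head_move)."""
    cells = dict(enumerate(tape))   # sparse tape; unwritten cells read 0
    head = 0
    for t in range(max_steps):
        if state == "halt":
            return cells, t
        symbol = cells.get(head, 0)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    raise RuntimeError("machine did not halt")

# Example: a unary incrementer -- skip the 1s, append one more.
rules = {
    ("start", 1): ("start", 1, +1),
    ("start", 0): ("halt", 1, 0),
}
cells, steps = run_tm(rules, [1, 1, 1])   # unary 3 -> unary 4
```

Whether the steps take nanoseconds or centuries of wall-clock time changes nothing about what the machine computes; only their order matters to the model.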



Or we can apply an abstract concept of time, and attempt to model what the brain is actually doing, rather than trying to shoehorn its functionality into the wrong model. The only reason for trying to model what the brain does in terms of a Turing machine is because Turing machines are how we think about computability. The concept of Turing machines wasn't obtained by looking at what brains do, because brains work in a very different way.


Yes, that is correct. But I don't see that as an important thing to argue because the whole idea of thinking in terms of Turing machines is only to see that we can model this in other systems. I don't see how thinking about this in terms of a Turing machine provides any other advantage. If we wanted to really model it in Turing machine terms the model would be hopelessly complex, which is why no one bothers to do so. The only useful argument about Turing computability is to show, as above, that the computation is possible in another system because it can be computed in abstract terms. That abstraction is not conscious. It can't be conscious. It doesn't exist.



If time dependence is an issue, then we need a model that takes it into account. There's no particular problem with this. Many physical equations include a little "t" in them. That's modelling time. Why is this not possible?


Who said it isn't?



No, it's because of the fundamental function of the brain. The brain and nervous system controls the body in real time. It has to respond in a given time or else we would stop functioning and die. Considering the brain as something performing an algorithm which will eventually produce a correct result is entirely inapplicable. It has quite a different function.

No, it goes down to the level of neuron function.
 
Ichneumonwasp said:
No, it's because of the fundamental function of the brain. The brain and nervous system controls the body in real time. It has to respond in a given time or else we would stop functioning and die. Considering the brain as something performing an algorithm which will eventually produce a correct result is entirely inapplicable. It has quite a different function.

No, it goes down to the level of neuron function.
Shouldn't the sentence be 'Considering something performing an algorithm which will eventually produce a correct result as (analogous to) the brain is entirely inapplicable.'?

That's the real-time problem. For example, why would a human agree that a computer whose computations take 10 years, rather than a brain's 10 milliseconds, is conscious?
 
That's the real-time problem. For example, why would a human agree that a computer whose computations take 10 years, rather than a brain's 10 milliseconds, is conscious?

Because general relativity states that if you are on a ship travelling 0.9999999~ C relative to me, something that takes my brain 10 milliseconds would take your brain more than 10 years, from my perspective.

Am I to conclude that you are no longer conscious, because time dilation has lowered the rates of things on your end relative to mine? Are you to conclude that I am no longer conscious, because my entire conscious life is compressed into less than a second of your perception?

(of course, if you just ignore science, and live by this dumb ol' "wall clock" mentality, you don't need to address these kinds of situations)
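For what it's worth, the slowdown factor here is the standard Lorentz gamma from special relativity. A quick sketch (writing the speed as v/c = 1 - eps so speeds extremely close to c stay representable in floating point; the specific numbers are only illustrative) shows that 0.9999999c stretches 10 ms to only about 22 seconds, but some higher speed does stretch it past 10 years, since gamma is unbounded:

```python
import math

# Lorentz time-dilation factor for a ship moving at v/c = 1 - eps.
# Standard special relativity; numbers below are only illustrative.
def gamma_from_eps(eps):
    # gamma = 1 / sqrt(1 - beta**2) with beta = 1 - eps
    return 1.0 / math.sqrt(eps * (2.0 - eps))

slow = gamma_from_eps(1e-7)   # v = 0.9999999c -> gamma ~ 2236
# so 10 ms dilates to roughly 22 seconds at that speed...

# ...but gamma grows without bound as eps -> 0. To stretch 10 ms
# past 10 years we need gamma ~ 3.16e10, i.e. eps ~ 5e-22:
needed = (10 * 365.25 * 86400) / 0.010
eps_needed = 1.0 / (2.0 * needed ** 2)
```

So the thought experiment goes through: for any desired slowdown factor there is a speed below c that realizes it.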
 
Because general relativity states that if you are on a ship travelling 0.9999999~ C relative to me, something that takes my brain 10 milliseconds would take your brain more than 10 years, from my perspective.

Am I to conclude that you are no longer conscious, because time dilation has lowered the rates of things on your end relative to mine? Are you to conclude that I am no longer conscious, because my entire conscious life is compressed into less than a second of your perception?

(of course, if you just ignore science, and live by this dumb ol' "wall clock" mentality, you don't need to address these kinds of situations)
Irrelevant. We are both conscious within our conception of wallclock time, and in the conception of other beings in our specific-to-us, albeit different, reference frames.
 
nescafe said:
If my central point is that we are lacking the knowledge of the "specific physical thing" that is a sufficient indicator of consciousness why in blue barfing blazes would I then turn around and claim knowledge of what it is?

That is the point -- there is no reason at all to assume that consciousness is a specific physical thing.

Except that it's limited to specific physical systems in specific physical states.

The substitutes must be able to at least chemically mimic the "natural" signal molecules in order to produce similar effects.

Why? Yes, the receptor sites have to have the same functional effect they currently do, but beyond that constraint we could (in theory) replace it with (say) pushrods, direct electrical connections, optical interconnects, whatever -- as long as the appropriate receptors excite or inhibit the target neuron to the same degree they currently do, nothing would have changed.

Translation: "Except for completely changing the physical substrate and mechanisms nothing would have changed." :rolleyes:

Whatever devices are used they must inhere the relevant physical properties that allow our biological neurons to produce consciousness.

Why? Why would mere functional equivalence not suffice?

Because emulating functionalities of an object/process is not reproducing said object/process, genius.

Being as how we do not know the physical "whats" and "hows" of consciousness we have no way of knowing what hardware systems would be sufficient beyond our own. It's common flipping sense, dude.

If it was "common flipping sense", then we would not be having this conversation.

Or if you actually possessed it to begin with...

It's entirely unjustified to assume that merely emulating the computational functions of our neurons is sufficient to produce consciousness -- especially when we have not yet discovered what consciousness is or how it is produced in the first place.

I simply think that talking about consciousness as if it were a special property of the neurochemistry of individual neurons is missing the forest for the chloroplasts. I think that consciousness arises as a consequence of the overarching neural architecture of our brains, and that we should be able to replicate that on a substrate that is not based on our biochemistry.

Consciousness is no more a product of brain "architecture" than photosynthesis is a product of canopy "architecture".

Even assuming that we actually do learn what physically constitutes consciousness, simulating it would not reproduce it any more than a computer simulation of gravity produces actual gravitational effects.

Again, we do not know that consciousness has something that specifically constitutes it at a physical level. The only good definitions we have are essentially behavioral ones.

How are they "good definitions" if they completely ignore -the- essential feature of consciousness [i.e. subjective experience]? By those "good definitions", you would be considered unconscious if you were completely paralyzed.

You CANNOT engineer a feature into a system without having a rudimentary understanding of it or, at the very least, the ability to physically identify it.

We do have at least a rudimentary understanding of consciousness as humans implement it. This intro to psych course provides a decent overview of how much we know as of late 2004. It does not get into philosophy, and I regard that as a Good Thing.

I reckon so. Fools tend to suck at philosophy anyway.

For the life of me, I don't get why you are so resistant to facing this fact.

Because it is not a fact, it is your personal opinion.

For instance, they confuse facts they don't wish to accept with "opinion".

Right. So basically you're saying that the chemical properties of neurotransmitters, and the physical conditions of biological brains have absolutely no relevance to the production of sensations, emotions, or other subjective states.

Oh, they have relevance, but at the level of describing the mechanics of how brains work, not at the level of describing the abstract behaviors those mechanics implement.

Right. They're relevant at the level of what the brain actually does but not at the level of your abstractions. [And WTF is "abstract behavior" supposed to be referring to? Subjective experiences -- a.k.a. qualia?]

You can't even address the basics of what qualia are, or how they are produced, yet you insist that creating them is a simple matter of "engineering".

Actually, I think that qualia is a useless philosophical term of art that has little relevance to what is actually happening in the brain.

ROFL!

You're ****ing with me, right? So in your view, subjective experiences have little relevance to what's actually happening in the brain? If that's your honest opinion you really -are- daft. I can see now this "discussion" has been an exercise in futility. If I'd realized before now that you're a complete fool I wouldn't have wasted my time...

You've gotta be kidding me. You have a theory of consciousness that explains nothing, criteria for discerning consciousness that can't even tell us if a nematode has subjective experiences and you seriously consider your "model" to be the epistemic equivalent of an internal combustion engine? Are you completely daft?

Only on the Internet, apparently.

I sincerely hope for your sake that you're not as weak-minded IRL.
 
The best evidence so far is that it isn't. It may be that it will turn out that some new theory will change things, but as things stand the universe looks as if it isn't computable.

I dug this up as I was perusing some of the earlier parts of this thread.

I'm curious about this Westprog. What exactly do you mean by this, that the universe is not computable? And what explained Aku's position that it is?
 
I dug this up as I was perusing some of the earlier parts of this thread.

I'm curious about this Westprog. What exactly do you mean by this, that the universe is not computable? And what explained Aku's position that it is?

What I was saying is that one can simulate and/or model anything in the universe via computation but that computation itself does not physically reproduce whatever is being modeled. I think westprog's point was that it is impossible to have an arbitrarily accurate simulation of the universe.
 
Except that it's limited to specific physical systems in specific physical states.
Just like everything else.

Translation: "Except for completely changing the physical substrate and mechanisms nothing would have changed." :rolleyes:
In terms of ability to process information, yes.

Because emulating functionalities of an object/process is not reproducing said object/process, genius.
In specific cases, it is. Especially when that process is an information-processing one.

Or if you actually possessed it to begin with...
I do well enough, thankyouverymuch.

Consciousness is no more a product of brain "architecture" than photosynthesis is a product of canopy "architecture".
Why not?

How are they "good definitions" if they completely ignore -the- essential feature of consciousness [i.e. subjective experience]?
They do not, they just think that subjective experience is also a behavior.

By those "good definitions", you would be considered unconscious if you were completely paralyzed.
Yep. False positives and false negatives suck, and "perfect" definitions and tests do not exist.

I reckon so. Fools tend to suck at philosophy anyway.
I don't share your opinions, therefore I am a fool?

For instance, they confuse facts they don't wish to accept with "opinion".
I do not blindly accept your pronouncements as facts, therefore I am doubly a fool?

Right. They're relevant at the level of what the brain actually does but not at the level of your abstractions.
No, just that there is a difference between neurochemistry and information processing.

[And WTF is "abstract behavior" supposed to be referring to? Subjective experiences -- a.k.a. qualia?]
No, just the information processing that the brain does as abstracted away from the neurochemistry of the brain.

You're ****ing with me, right?
Nope. Daniel Dennett is.

So in your view, subjective experiences have little relevance to what's actually happening in the brain?
I never said that. They are, of course, intimately correlated. What I have said is that there are reasons to think that we can, in principle, build something that has subjective experiences without having to simulate, emulate, or create a biological brain.

If that's your honest opinion you really -are- daft. I can see now this "discussion" has been an exercise in futility. If I'd realized before now that you're a complete fool I wouldn't have wasted my time...

I sincerely hope for your sake that you're not as weak-minded IRL.
Yes, it seems that anyone who disagrees with you is a fool. You had to say so 5 times. I am sure everyone got your point.
 
Irrelevant. We are both conscious within our conception of wallclock time, and in the conception of other beings in our specific-to-us, albeit different, reference frames.

wtf?

You just asked
For example, why would a human agree that a computer whose computations take 10 years, rather than a brain's 10 milliseconds, is conscious?
and you are saying time dilation is irrelevant?

If you are on a starship, and it takes your brain 10 years to do the same thing it takes my brain 10 milliseconds, why would I agree that you are conscious? Hmmm?
 
Consciousness is no more a product of brain "architecture" than photosynthesis is a product of canopy "architecture".

Why not?

The "architecture" of the brain does not change between states of consciousness and unconsciousness, nor does it cease to process information during states of unconsciousness.

How are they "good definitions" if they completely ignore -the- essential feature of consciousness [i.e. subjective experience]?
They do not, they just think that subjective experience is also a behavior.

A "behavior" that has yet to be physically identified or understood.

By those "good definitions", you would be considered unconscious if you were completely paralyzed.
Yep. False positives and false negatives suck, and "perfect" definitions and tests do not exist.

Hence the abysmal inadequacy of both your "model" and your criteria.

I reckon so. Fools tend to suck at philosophy anyway.

I don't share your opinions, therefore I am a fool?

No. You're just a fool. The difference of opinion is incidental.

For instance, they confuse facts they don't wish to accept with "opinion".

I do not blindly accept your pronouncements as facts, therefore I am doubly a fool?

The fact that it has to be painstakingly spelled out to you why it is necessary to understand the basic principles of subjective experience before engineering it demonstrates to me that, despite all your education, you are a fool.

Right. They're relevant at the level of what the brain actually does but not at the level of your abstractions.

No, just that there is a difference between neurochemistry and information processing.

Just as there is a difference between consciousness and information processing; the former is a concrete physical phenomenon and the latter is a functional abstraction.

You're ****ing with me, right?
Nope. Daniel Dennett is.

I see you compensate for your personal lack of insight by letting others do your philosophical thinking for you.

So in your view, subjective experiences have little relevance to what's actually happening in the brain?

I never said that. They are, of course, intimately correlated. What I have said is that there are reasons to think that we can, in principle, build something that has subjective experiences without having to simulate, emulate, or create a biological brain.

You just said that qualia have little relevance to what the brain actually does. "Qualia" is simply a term for subjective experiences. Until they are physically identified and scientifically understood we cannot design consciousness into any artificial system.

If that's your honest opinion you really -are- daft. I can see now this "discussion" has been an exercise in futility. If I'd realized before now that you're a complete fool I wouldn't have wasted my time...

I sincerely hope for your sake that you're not as weak-minded IRL.

Yes, it seems that anyone who disagrees with you is a fool. You had to say so 5 times. I am sure everyone got your point.

You fervently believe that one can systematically engineer a feature that they cannot physically identify; that is foolish. You've stated that subjective experience is irrelevant to consciousness and that you "only focus on behavior"; that is also foolish. You willfully accept a theory of consciousness that has no explanatory power with criteria that cannot even identify it; this is unspeakably foolish. That you would stubbornly attempt to argue from such a profoundly weak position demonstrates to me that you are indeed a fool.
 
Shouldn't the sentence be 'Considering something performing an algorithm which will eventually produce a correct result as (analogous to) the brain is entirely inapplicable.'?

That's the real-time problem. For example, why would a human agree that a computer whose computations take 10 years, rather than a brain's 10 milliseconds, is conscious?


I'm afraid the time scale is irrelevant. It is the integration of information that is at issue with time dependence; and that problem begins at the level of the neuron, whatever the time scale, not at the level of the brain as a whole. We could theoretically speed up or slow down the world so that it took place at any particular speed and see the same thing as we do now, provided information integrated in the same way. This is not an issue that only involves brains. It involves neurons. Anyone who does not see why needs to spend more time thinking about how neurons work in the first place.
 
As westprog pointed out, introspection is how we know we are conscious and the contents of our consciousness. Without introspection we wouldn't even have self-reports. Introspection just means looking "inward".

I didn't say introspection doesn't exist. I said it isn't reliable.

In a sense, consciousness is a "thing". We don't just have consciousness, or do consciousness; we are consciousness.

That is your opinion. Mine is that we "do". It seems much more likely, to me, anyway, that consciousness is the result of processes in the [human] body, than it's some as-of-yet undetected "thing" inside the brain.

Absent consciousness, we do not exist as subjects.

Philosophically, or grammatically ?

Regardless of whether one wants to think of consciousness as a "thing" or a "process", we know for certain that it is physically salient and real.

Yes. But is it as real as legs ? Or as real as "running" ?

Keep in mind that atoms are themselves physical processes.

I've already addressed this.

Also, we each directly experience our own minds from the "inside" so we can atleast study the subjective aspects of consciousness; with atoms we didn't have the luxury of being able to study them before their scientific discovery. As we're already intimately familiar with the internal subjective dimension of consciousness

We think we do. But like other things introspective I suspect we'll be surprised by the answer.

As of now, all science has to go on are general behaviors and functions associated with consciousness, but we've yet to objectively pin down the physical thing in itself.

That may be because you're assuming there is such a thing. If it's a process one could not identify the "thing", only the "thing doing".
 
The movement of an atom is just the uncertainty of its position, and vice versa. There is no real distinction between an atom and the movement of an atom. Atoms, and all other physical objects, are essentially fluctuating waves of potentiality; i.e. energy.

Really ? So a proton moving close to c isn't really moving ? We just don't know exactly where it is ? It really didn't travel from a supernova to us ?
 
Irrelevant. We are both conscious within our conception of wallclock time, and in the conception of other beings in our specific-to-us, albeit different, reference frames.

Not really, most people assume they meet the criteria of a poorly defined term, 'conscious'. Now if we stick to the medical definition it gets easier.
 
I'm afraid the time scale is irrelevant. It is the integration of information that is at issue with time dependence; and that problem begins at the level of the neuron, whatever the time scale, not at the level of the brain as a whole. We could theoretically speed up or slow down the world so that it took place at any particular speed and see the same thing as we do now, provided information integrated in the same way. This is not an issue that only involves brains. It involves neurons. Anyone who does not see why needs to spend more time thinking about how neurons work in the first place.
Why yes. Under special relativity what is perceived by entities in a specific reference frame is that their wallclock keeps time as it always has.

For entities 'at rest' with respect to that reference frame, the moving-frame wallclock is different, but their personal wallclocks keep time as always.

If one wishes to find obfuscation, consider what physics is currently missing: general relativity and quantum mechanics do not agree. Under which set of formulas is consciousness computable?

Of course neurons are needed for the brain to "compute" regardless of reference frame. My initial comment '10 years' vs '10 milliseconds' assumes the same reference frame, and for entities (us) that manage it in 10 msecs, the 10-year entity fails to demonstrate consciousness in any sense of the word meaningful to us.
 
