nescafe
Caffeinated Beverage
Joined: Apr 25, 2006
Messages: 862
> If my central point is that we are lacking the knowledge of the "specific physical thing" that is a sufficient indicator of consciousness, why in blue barfing blazes would I then turn around and claim knowledge of what it is?

That is the point -- there is no reason at all to assume that consciousness is a specific physical thing.
> The substitutes must be able to at least chemically mimic the "natural" signal molecules in order to produce similar effects.

Why? Yes, the receptor sites have to have the same functional effect they currently do, but beyond that constraint we could (in theory) replace them with (say) pushrods, direct electrical connections, optical interconnects, whatever -- as long as the appropriate receptors excite or inhibit the target neuron to the same degree they currently do, nothing would have changed.
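The substrate-independence claim above can be sketched as a toy model (all names here are illustrative, not from the thread): two different "transmission mechanisms" that excite a target neuron by the same amount are indistinguishable from the target's point of view.

```python
# Toy sketch of substrate independence: a chemical synapse and a direct
# electrical link are interchangeable as long as they excite or inhibit
# the target neuron to the same degree.

class TargetNeuron:
    def __init__(self, threshold: float = 1.0):
        self.threshold = threshold
        self.potential = 0.0

    def receive(self, amount: float) -> None:
        # Positive amounts excite, negative amounts inhibit.
        self.potential += amount

    def fires(self) -> bool:
        return self.potential >= self.threshold

def chemical_synapse(target: TargetNeuron) -> None:
    # Stand-in for neurotransmitter release binding to receptor sites.
    target.receive(0.6)

def electrical_link(target: TargetNeuron) -> None:
    # Stand-in for a direct electrical (or optical, or pushrod...) connection
    # tuned to have the same functional effect.
    target.receive(0.6)

# Either mechanism, applied twice, leaves the target in the same state:
a, b = TargetNeuron(), TargetNeuron()
chemical_synapse(a); chemical_synapse(a)
electrical_link(b); electrical_link(b)
assert a.potential == b.potential
assert a.fires() and b.fires()
```

The design choice mirrors the argument: `TargetNeuron` only sees `receive(amount)`, so any mechanism that delivers the same amount is functionally equivalent by construction.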
> Whatever devices are used, they must possess the relevant physical properties that allow our biological neurons to produce consciousness.

Why? Why would mere functional equivalence not suffice?
> Being as how we do not know the physical "whats" and "hows" of consciousness, we have no way of knowing what hardware systems would be sufficient beyond our own. It's common flipping sense, dude.

If it were "common flipping sense", then we would not be having this conversation.
> It's entirely unjustified to assume that merely emulating the computational functions of our neurons is sufficient to produce consciousness -- especially when we have not yet discovered what consciousness is or how it is produced in the first place.

I simply think that talking about consciousness as if it were a special property of the neurochemistry of individual neurons is missing the forest for the chloroplasts. I think that consciousness arises as a consequence of the overarching neural architecture of our brains, and that we should be able to replicate that on a substrate that is not based on our biochemistry.
> Even assuming that we actually do learn what physically constitutes consciousness, simulating it would not reproduce it any more than a computer simulation of gravity produces actual gravitational effects.

Again, we do not know that consciousness has something that specifically constitutes it at a physical level. The only good definitions we have are essentially behavioral ones.
> You CANNOT engineer a feature into a system without having a rudimentary understanding of it or, at the very least, the ability to physically identify it.

We do have at least a rudimentary understanding of consciousness as humans implement it. This intro-to-psych course provides a decent overview of how much we knew as of late 2004. It does not get into philosophy, and I regard that as a Good Thing.
> For the life of me, I don't get why you are so resistant to facing this fact.

Because it is not a fact; it is your personal opinion.
> Right. So basically you're saying that the chemical properties of neurotransmitters and the physical conditions of biological brains have absolutely no relevance to the production of sensations, emotions, or other subjective states.

Oh, they have relevance, but at the level of describing the mechanics of how brains work, not at the level of describing the abstract behaviors those mechanics implement.
> You can't even address the basics of what qualia are, or how they are produced, yet you insist that creating them is a simple matter of "engineering".

Actually, I think that "qualia" is a useless philosophical term of art that has little relevance to what is actually happening in the brain.
> Just who do you think you're kidding?

No-one. I am participating in this conversation primarily for my own amusement.
> I never said they are the only possible ones.

Good, then you don't have a problem with replacing them with some other mechanism, or with abstracting them away entirely.
> The point is to find out WHAT consciousness is and HOW that "synaptic activity" produces it in the first place.

I think that describing consciousness in terms of synaptic activity would be an epic waste of time, for the same reason that describing Windows 7 in terms of circuit layouts would be.
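The levels-of-description point above can be illustrated with a toy sketch (the routine and its trace are illustrative inventions): a short behavioral description of a program ("it sorts") is complete at its own level, while an exhaustive log of the low-level operations is far bulkier and adds nothing to the behavioral account.

```python
# Toy illustration of levels of description: the behavioral account of this
# routine is one sentence ("it sorts"); the low-level operation trace -- the
# "circuit layout" view -- grows rapidly with input size.

def traced_bubble_sort(xs):
    xs = list(xs)
    trace = []  # every low-level compare/swap the routine performs
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            trace.append(("compare", j, j + 1))
            if xs[j] > xs[j + 1]:
                trace.append(("swap", j, j + 1))
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs, trace

result, trace = traced_bubble_sort([3, 1, 2])
assert result == [1, 2, 3]        # the whole behavioral description
assert len(trace) > len(result)   # the low-level account is already longer
```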
> Oh my god! Could you -please- spare me the constant appeals to "but we can simulate it!"

Not until you can explain why a sufficiently detailed simulation of a conscious system would not be conscious, without resorting to your usual red-herring objections. After all, a simulation of an information-processing system still processes information.
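The claim that a simulation of an information-processing system still processes information has a concrete toy instance: a half adder "simulated" out of software NAND gates really does add -- the arithmetic is not merely represented, it is performed. (The gate construction below is a standard one, included here as an illustration.)

```python
# A half adder built entirely from simulated NAND gates. The simulation of
# this information-processing circuit genuinely performs the computation:
# the simulated adder really adds.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def half_adder(a: int, b: int):
    n1 = nand(a, b)
    s = nand(nand(a, n1), nand(b, n1))  # XOR of a, b -> sum bit
    c = nand(n1, n1)                    # AND of a, b -> carry bit
    return s, c

assert half_adder(0, 0) == (0, 0)
assert half_adder(1, 0) == (1, 0)
assert half_adder(1, 1) == (0, 1)  # 1 + 1 = binary 10
```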
> First of all, we do not know what consciousness is to begin with, so claims of knowledge of how to simulate it are complete ****ing rubbish.

As are claims that it must have something special to do with the details of our neurochemistry.
> Second of all, even if we did have the knowledge required to design such a simulation, simulation itself is, in principle, NOT a reproduction.

In principle, yes. In the real world, when dealing with specialized information-processing systems, not so much.
The reason I think simulation will be sufficient is that the primary output of conscious systems is behavior that other conscious systems recognize as conscious.
> Your entire position basically boils down to: "Brains compute, therefore consciousness is a priori computation. All that is needed to produce consciousness is to emulate the computations of the brain and call it a day."

If by "all that" you mean "one way". I think that brute-force simulation at a neuron-by-neuron level will be useful primarily as a stepping stone, if we have to do it at all.
> Yet, in the same breath, you'll handwave away the significance of those conditions, or even the need to understand how they relate to consciousness.

No, I just do not assign the same significance that you do to those low-level phenomena.
> Stop lying to yourself.

I am not.
> It's a flat fact that we do not know what consciousness is or understand how the chemistry/physics of the brain produces subjective experience.

I do not think we will find the answers to subjective experience at the level of the chemistry or physics of the brain. I think we will find them at the level of the information-processing properties of the brain. Stop insisting that we live in a state of complete ignorance on the problem of subjective experience, or that your approach is the only valid one.
> Your claim that computation is a sufficient explanation of consciousness is not only a -belief-, it's a completely unjustified one at that.

And if that were exactly what I was claiming, your rancor might be justified. I claim that computation is necessary, just as chemistry and physics are. I claim that we have not found what the sufficient constraints on computation (or physics, or chemistry) are.
> So, by your criteria, how would one go about discerning whether a nematode has subjective experiences and, if so, what the range of its experiences is, what it will experience given a particular stimulus, and what it is experiencing during a given period of time?

My model cannot answer these questions, as it focuses on behaviour rather than trying to account for the details of subjective experience.
> If you cannot answer these questions you do not have a sufficient theory of consciousness, and all your handwaving bluster about computational criteria amounts to nothing more than a pile of empty platitudes.

We do not have enough information to formulate any sort of test for consciousness by your criteria, so why should I rely on any test created at our current level of knowledge, which can depend on nothing but self-reporting of subjective experience -- knowing that self-reporting can be fooled by a program that prints "I think therefore I am"? Behavioral tests are much more useful right now.
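The "can be fooled" worry above is easy to make concrete (the function names here are illustrative, not from the thread): a test that counts any claim of subjective experience as evidence of consciousness is passed by a one-line program, so it has no discriminating power.

```python
# Toy sketch of why a naive self-report test for consciousness fails: a
# trivial canned-output program passes it just as well as anything else.

def naive_self_report_test(program_output: str) -> bool:
    # Counts any claim of subjective experience as evidence of consciousness.
    return "I think" in program_output

def trivial_program() -> str:
    # The one-line "fooler" from the post.
    return "I think therefore I am"

# The trivial program passes the test, so the test discriminates nothing:
assert naive_self_report_test(trivial_program())
```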
> Yet you can't tell me the first thing about what those allegedly conscious systems are experiencing or how your "criteria" even relate to those experiences. Get real.

Yup -- I focus on behaviors, not "experiences", subjective or otherwise.
> In other words, when you were claiming that you knew the sufficient criteria for discerning consciousness, you were just talking outta your behind.

Again: behavior, not subjective experience.
> You've gotta be kidding me. You have a theory of consciousness that explains nothing, criteria for discerning consciousness that can't even tell us whether a nematode has subjective experiences, and you seriously consider your "model" to be the epistemic equivalent of an internal combustion engine? Are you completely daft?

Only on the Internet, apparently.