rocketdodger
Philosopher
Joined: Jun 22, 2005
Messages: 6,946
how much is required? or how much takes place?
Nick
Required.
You really don't know anything about this topic, do you?
AkuManiMani said: If there were an artificial construct created with operational complexity comparable to, or greater than, that of a human, would you consider it to have 'greater' value than the life of a human?
That depends.
AkuManiMani said: For example, would it be justified to kill, or otherwise harm, a human to prevent harm from being done to said construct merely on the basis of its complexity?
Merely on the basis of complexity? No. Do we take complexity into account in such decisions? Certainly.
AkuManiMani said: Really? I thought the capacity to experience suffering was the basis for giving an entity ethical consideration.
It's one consideration. But very simple systems have no capacity to experience suffering, so it's really just a loose measure of complexity.
[...]
AkuManiMani said: Of course. Because a brick is not alive and, as far as can be told, bricks cannot experience anything, let alone suffering.
Right. They are insufficiently complex.
AkuManiMani said: The thing is, if one were to arrange bricks or their materials in a more 'complex' way, this would not change.
Fallacy of division. Quite a staggering example, in fact, since you make the assertion not only for bricks but for their materials.
AkuManiMani said: Clearly there is more to ethical concerns than mere complexity.
Clearly you haven't thought this through.
Required.
You really don't know anything about this topic, do you?
Irrelevant, untrue, and a logical fallacy. Good work!
Why?
Quote:
There is such a disparity between how our brains typically conceive of self and how they actually manifest self that human culture inevitably exists on a continual existential precipice.
I thought this whole thread was concerned with the problem of establishing how our brains conceive of self, and here you are claiming to be able to measure it against its apparent manifestation and expounding on its cultural implications... Wouldn't it be wise first to establish how our brains typically conceive of self before trying to make such comparisons?
Quote:
Value systems which reflect evolution-acquired biological needs will have to adapt to the reality of our computational nature.
No adaptation required. The reality of our computational nature has evolved as we evolved and our value systems are a product of that evolution. Perhaps you meant they will need to adapt to our realisation of our computational nature?
Quote:
My point was that cognitive complexity ["complexity of consciousness" or whatever you want to call it] is hardly relevant in ethical or moral considerations. What qualifies a subject as being worthy of moral consideration is whether or not it has the capacity to experience suffering -- or to experience anything in a qualitative manner at all. If one wanted to argue for ethics on the basis of "complexity of consciousness", then the life of a person of average intelligence would easily trump that of someone who is mentally handicapped.
If that is your point (and I think it's a good one), then why not just state it instead of arguing around a straw-man misrepresentation of the OP? If it's accidental, it's careless; if not, it's underhand.
So, if we consider your example of getting out of a chair, you're stating that the number of processing functions undertaken by the organism could not be reduced whilst maintaining parity on a behavioural level? There are no possible short cuts?
Nick
Can't you fake it with a common superclass (ellipsoid?) that only implemented common methods/functions...? (ETA: Also, keep in mind that I'm fairly skilled at OO programming, which corrupts the brain -- because it pushes me into a crippled condition of thought whereby I'm forced to admit such ludicrous things as that circles are not ellipses.)
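(A minimal C++ sketch of the "common superclass" idea above, purely illustrative -- the Ellipsoid/Ellipse/Circle names and methods are invented here, not taken from the thread. The base class exposes only operations both shapes can honour, which is the usual way OO designs sidestep the circle-ellipse problem.)

Code:
#include <iostream>

// Hypothetical sketch of the "common superclass" idea from the quote.
// The base class exposes only operations both shapes can honour;
// there is no setWidth/setHeight pair for a subclass to violate.
class Ellipsoid {
public:
    virtual ~Ellipsoid() = default;
    virtual double area() const = 0;
    virtual void scale(double factor) = 0;  // uniform scaling is safe for both
};

class Ellipse : public Ellipsoid {
public:
    Ellipse(double a, double b) : a_(a), b_(b) {}
    double area() const override { return 3.14159265358979 * a_ * b_; }
    void scale(double f) override { a_ *= f; b_ *= f; }
private:
    double a_, b_;  // semi-axes
};

class Circle : public Ellipsoid {
public:
    explicit Circle(double r) : r_(r) {}
    double area() const override { return 3.14159265358979 * r_ * r_; }
    void scale(double f) override { r_ *= f; }
private:
    double r_;  // the circle's invariant (a single radius) never has to bend
};

int main() {
    Circle c(2.0);
    Ellipse e(2.0, 3.0);
    c.scale(0.5);
    std::cout << c.area() << " " << e.area() << "\n";
}

Nothing in the base class lets a caller stretch one axis independently, so a Circle never has to pretend to be a freely resizable Ellipse.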
If that is your point (and I think it's a good one), then why not just state it instead of arguing around a straw-man misrepresentation of the OP? If it's accidental, it's careless; if not, it's underhand.
I thought this whole thread was concerned with the problem of establishing how our brains conceive of self, and here you are claiming to be able to measure it against its apparent manifestation, and expounding on its cultural implications... wouldn't it be wise first to establish how our brains typically conceive of self before trying to make such comparisons?
No adaptation required. The reality of our computational nature has evolved as we evolved and our value systems are a product of that evolution. Perhaps you meant they will need to adapt to our realisation of our computational nature?
No, not at all.
I am saying that you can only reduce the number of functions so far before you start to cut in on what contributes to the conscious behavior.
For example, you could most likely safely discard most cellular control functions that don't impact immediate muscle or neuron responses. That gets rid of a ton right there.
But what you are necessarily left with is still orders of magnitude more than what a 386 can handle. I know this because procedural skeletal mesh control in games is actually one of my specialties.
Just handling the kinematics of a two-joint limb doing something as simple as resting on an object is a pretty hefty amount of computation, and that is restricted to something like 30 frames per second anyway, not continuous time.
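(For a feel of the per-frame work being described, here is a minimal sketch of a standard analytic two-bone IK solve, restricted to a plane and run at an assumed 30 updates per second. The bone lengths and target values are made up for illustration; this is not the poster's actual engine code.)

Code:
#include <cmath>
#include <cstdio>

// Minimal planar two-bone IK sketch (illustrative values only).
// Given upper/lower bone lengths and a target point, solve the two
// joint angles with the law of cosines -- roughly the per-frame work
// a skeletal controller does for one limb, once per rendered frame.
struct JointAngles { double shoulder, elbow; };

JointAngles solveTwoBoneIK(double l1, double l2, double tx, double ty) {
    double d = std::sqrt(tx * tx + ty * ty);
    // Clamp so unreachable targets just fully extend or fold the limb.
    d = std::fmax(std::fabs(l1 - l2), std::fmin(d, l1 + l2));

    // Law of cosines for the interior elbow angle.
    double cosElbow = (l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2);
    double elbow = std::acos(std::fmax(-1.0, std::fmin(1.0, cosElbow)));

    // Shoulder angle: direction to the target plus the offset caused by the elbow bend.
    double cosInner = (l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d);
    double shoulder = std::atan2(ty, tx)
                    + std::acos(std::fmax(-1.0, std::fmin(1.0, cosInner)));
    return {shoulder, elbow};
}

int main() {
    // Pretend 30 updates per second for one second of "resting on an object".
    for (int frame = 0; frame < 30; ++frame) {
        double t = frame / 30.0;
        JointAngles a = solveTwoBoneIK(0.3, 0.25, 0.4, 0.1 - 0.02 * t);
        std::printf("frame %2d: shoulder %.3f rad, elbow %.3f rad\n",
                    frame, a.shoulder, a.elbow);
    }
}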
What you have to understand, however, is that the computing paradigm used on a 386 is very different from a biological neural network. You might be able to argue that if all the transistors in a 386 were used to construct a dedicated neural network, then maybe such a thing could control a limb with as many joints as a human arm in a fairly human-like fashion. But that isn't what we have -- if you want to use a neural network, you have to emulate it using the serial execution of an x86 processor, which results in a lot of wasted computation. So regardless of the method used on a 386, it just doesn't have the raw FLOPS required.
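(A toy sketch of the "wasted serial computation" point: one small fully-connected layer emulated the only way a serial CPU can, one multiply-add at a time. The layer sizes, the 30 Hz control rate, and the roughly 0.25 MFLOPS budget assumed for a 386/387-class machine are all illustrative guesses, not measurements.)

Code:
#include <cstdio>
#include <vector>

// Toy sketch: emulate one fully-connected neural layer serially, count
// the multiply-adds, then compare against an assumed budget for an old CPU.
// All sizes and the MFLOPS figure are illustrative assumptions.
int main() {
    const int inputs = 200, outputs = 100;          // assumed layer size
    const double updatesPerSecond = 30.0;           // assumed control rate
    const double assumedMflops = 0.25;              // rough 386/387-class guess

    std::vector<double> x(inputs, 0.1), w(inputs * outputs, 0.01), y(outputs);

    long long ops = 0;
    for (int o = 0; o < outputs; ++o) {             // serial emulation:
        double sum = 0.0;
        for (int i = 0; i < inputs; ++i) {          // one multiply-add at a time
            sum += w[o * inputs + i] * x[i];
            ops += 2;                               // one mul + one add
        }
        y[o] = sum;
    }

    double opsPerSecond = ops * updatesPerSecond;
    std::printf("%lld ops per update, %.0f ops/s needed at %.0f Hz\n",
                ops, opsPerSecond, updatesPerSecond);
    std::printf("assumed budget: %.0f ops/s -> load factor %.1fx\n",
                assumedMflops * 1e6, opsPerSecond / (assumedMflops * 1e6));
}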
Well, my comments weren't directly referring to the topic of the OP; I've pretty much had my say on that a few dozen pages back. I was indirectly addressing Pixy's position on what constitutes a sufficient description of consciousness. In his view, reflexive processing is not only a necessary requisite of consciousness, it is consciousness. A simple feedback device [like a thermostat] is 'aware' by his definition; add another regulatory feedback system on top of an 'aware' system and -- presto -- it's 'conscious'.
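(To make that description concrete, here is a tiny illustrative C++ sketch -- not Pixy's definition or code, just the "feedback on top of feedback" picture as described above: a thermostat regulating temperature, and a second loop regulating the thermostat's setpoint. All names and numbers are invented.)

Code:
#include <cstdio>

// Illustrative only: a first-order feedback device (thermostat) plus a
// second regulatory loop layered on top of it (a supervisor that adjusts
// the thermostat's setpoint based on its recent behaviour).
struct Thermostat {
    double setpoint = 20.0;
    bool heaterOn = false;
    void regulate(double temperature) {             // first feedback loop
        heaterOn = temperature < setpoint;
    }
};

struct Supervisor {
    double target = 22.0;                           // desired average temperature
    void regulate(Thermostat& t, double avgTemperature) {   // second loop
        // Nudge the thermostat's setpoint based on how the first loop performed.
        t.setpoint += 0.1 * (target - avgTemperature);
    }
};

int main() {
    Thermostat t;
    Supervisor s;
    double temperature = 18.0, runningAvg = 18.0;

    for (int step = 0; step < 50; ++step) {
        t.regulate(temperature);
        temperature += t.heaterOn ? 0.4 : -0.2;     // crude room model
        runningAvg = 0.9 * runningAvg + 0.1 * temperature;
        if (step % 10 == 9) s.regulate(t, runningAvg);
    }
    std::printf("final temp %.2f, setpoint %.2f\n", temperature, t.setpoint);
}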
When it was pointed out to him by me and others that this definition doesn't address or explain qualia [i.e. subjective experiences], which are the hallmark of consciousness, he just responded with his stock error messages: "Wrong", "Nonsense", or everyone's personal favorite "Irrelevant". When pressed further on the issue he simply filibusters. Apparently, certain concepts do not compute with him and conscious experience is one of them. I'm simply trying another means of helping him understand what is being discussed here.
Can't you fake it with a common superclass (ellipsoid?) that only implemented common methods/functions...?
Sorry, a bit off topic!
'Qualia' is, like 'mind', an incoherent word lacking precision.
Well, qualia aren't all that popular a phenomenon around the Strong AI scene. The basic position is that the quale is conceptually erroneous. It's a concept that reinforces a viewpoint that is invalid - it's not what happens.
I find the idea that qualia can simply be dismissed in this way to be an absurdity. All the people making this argument presumably experience qualia - and yet they seem to regard them as some kind of guilty secret; a shameful betrayal of their materialist principles.
If there's no precise definition of qualia, then the first task should be to find such a definition.
What do you understand by the term "qualia"? If it's "the bit left over which processing can't explain", then the computational theory of consciousness is inevitably going to be a struggle. Maybe it's something else.
Nick