Frank Newgent
Philosopher
In its simplest form, yes. But not at the same level as a human, obviously.
In the same way that a running stove is "hot", but not quite like the sun.
Would the sun consider the stove hot?
The strong interpretation of Goethe's Metamorphosis of Plants demonstrates that the technique of imaginative thinking as a form of introspection can be used to describe objective reality and hypothesize the homologous structures of plant organs. This same technique can be used to study consciousness.

PixyMisa said:
What, exactly, does the strong interpretation - or any interpretation - of Goethe's Metamorphosis of Plants have to do with the subject?

I will take a stab.
Using the strong interpretations of Goethe's Metamorphosis of Plants as a starting point.
http://hps.elte.hu/~zemplen/goethemorph.html
PixyMisa said:
What evidence is there that the strong interpretation - or any interpretation - of Goethe's Metamorphosis of Plants is in any way valid?
PixyMisa said:
It is true that our understanding of how things come about has much to do with what those things do. Unfortunately, this is in no way a substantive response to the question; it's little more than hand-waving.

By attributing the same lawfulness to the outer world as to the inner, and by training the faculty for comprehending this relationship, we can come to the conclusion that an understanding of how consciousness comes about has as much to do with its effects as with its reasons.
It requires imagination to predict the contingent results of consciousness.

PixyMisa said:
How does this follow from the previous statement?

Thus the imaginative study of human endeavors will reveal additional requirements for consciousness in addition to computation/reason.
You missed the point.

PixyMisa said:
What evidence do you have that song writing requires more than computation, particularly when we already have song-writing computers?

I suggest beginning with song writing.
PixyMisa said:
What experience? Experience of what? What evidence and reasoned argument can you present to back up this experience?

Experience.
Any process that is not understood at the level of the physics is not fully understood. That goes for all biological processes. That we understand respiration in a biological sense in no way militates against understanding it in a physical sense.
Is there any biological process where the physics is not relevant?
As I don't like having to repeat myself, please read post 1162.
That's a bit unfortunate. If the discussion is boring you to the point where you're not motivated to follow it, then perhaps it's a sign you should be spending your time doing something more stimulating!
The strong interpretation of Goethe's Metamorphosis of Plants demonstrates that the technique of imaginative thinking as a form of introspection can be used to describe objective reality and hypothesize the homologous structures of plant organs. This same technique can be used to study consciousness.

This same technique can indeed be used to study consciousness. However, it gives answers that are now well established to be wrong.
You asked Aku what else might be necessary for consciousness apart from computation. I have replied that consciousness is not just understood as a result of reasons, but the reason for results. In other words, consciousness not only has necessary reasons, but also contingent results.

Evidence?
It requires imagination to predict the contingent results of consciousness.

Evidence?
You missed the point.
Song writing develops the imagination.

In computers?
So your consciousness would be simultaneously internal and external, which demonstrates that it is a purely arbitrary distinction.
Yeah but you aren't accounting for the fact that certain individuals have a vested interest in woo.
The only way to maintain such a view, in the face of contrary evidence, is to demand a perfect model and yet more evidence.
AkuManiMani said:
Now use that knowledge to design a device that experiences the sensation of "cold" when it's at room temperature or above.
Ah, yes the argument-from-it-hasn't-been-done-yet-so-it-can't-be-done-ever.
I take it, then, that since you have fallen back on this stupid fallacy (you aren't the first, don't worry) you finally concede my point. Thank you.
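For what it's worth, the behavioural half of that challenge is easy to write down; what this exchange is actually disputing is whether running anything like the sketch below amounts to feeling cold. This is a hypothetical Python sketch, and read_temperature_celsius() is a stand-in for whatever sensor a real device would poll.

```python
# Behavioural sketch of a "reports cold at room temperature or above" device.
# Whether running this, or any elaboration of it, constitutes experiencing
# cold is precisely what is under dispute in this exchange.

def read_temperature_celsius():
    # Hypothetical stand-in for a real sensor call.
    return 22.5  # placeholder reading; a real device would poll a thermometer

def report_sensation(threshold_c=20.0):
    """Report 'cold' whenever the ambient reading is at or above the threshold."""
    return "cold" if read_temperature_celsius() >= threshold_c else "comfortable"

print(report_sensation())  # -> "cold" at room temperature and above
```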
AkuManiMani said:
As I don't like having to repeat myself, please read post 1162.
Yeah, read it. It's the same argument that I've already addressed, so I'm not sure why you are repeating it.
I suppose only you know why you keep repeating the same argument when it has already been addressed, so if you don't want to repeat it, why not try understanding what others are telling you?
From the fact that information processing -- meaning just any old sort -- is not sufficient to be consciousness, it does not follow that it is not possible for information processing of some type to be consciousness. A critical issue concerns what type of information processing is under discussion.
[bolding added]
Sure, lots of examples of information processing are not consciousness. I don't know anyone who says that just any old type of IP is consciousness. What we need to understand is the structure of the IP that is consciousness. All the evidence to date suggests that brains process information, and there is no significant evidence that any cellular feature of brains, other than the fact that they consist of excitable cells with semipermeable membranes, is important to the process that we call consciousness (or any other information processing carried out by the brain).
Computers are capable of duplicating those IP features of brains, so it is reasonable to believe that computers can do the same thing.
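To make the "duplicating those IP features" claim concrete, here is a minimal sketch of the integrate/leak/fire behaviour of an excitable cell reproduced in ordinary software. It assumes the standard leaky integrate-and-fire abstraction, and the constants are illustrative rather than physiological measurements.

```python
# Minimal leaky integrate-and-fire sketch: integrate input, leak charge,
# fire on crossing a threshold, then reset. The point is only that this
# information-processing behaviour runs on a conventional computer; the
# parameter values are illustrative, not physiological.

def simulate_lif(inputs, threshold=1.0, leak=0.9, reset=0.0):
    """Return a 0/1 spike train for a stream of input currents."""
    v = 0.0            # membrane potential, arbitrary units
    spikes = []
    for current in inputs:
        v = leak * v + current      # integrate the input, with leak
        if v >= threshold:          # threshold crossing -> spike
            spikes.append(1)
            v = reset               # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.3, 0.3, 0.0, 0.8, 0.8, 0.1, 0.1]))
# -> [0, 0, 0, 0, 1, 0, 0, 0]
```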
Your argument is simply a waste of time. Why do you keep repeating it?
The fact that you've even made this statement demonstrates that you don't understand what I'm arguing. That consciousness carries out a KIND OF information processing is [forgive the pun] a no-brainer. What distinguishes conscious information processing from unconscious processing [in our own brains, mind you] is the physical context of that processing.
Those cells still have the same IP features during conscious and unconscious states. The brain does not shut off or cease processing information when the subject is in deep sleep, or otherwise unconscious. What differs between varying states of consciousness and unconsciousness is the energetic state of the brain; consciousness is a biophysics problem, first and foremost. Trying to re-create consciousness in artificial systems while ignoring the relevant physical states that are correlated with -known- examples of consciousness in actual living brains is -- quite frankly -- unspeakably asinine.
And computers are capable of duplicating IP features of chemical combustion and fission; that doesn't mean that their processing will produce actual fire or nuclear reactions.
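As a concrete version of that analogy, the sketch below reproduces the energy bookkeeping of burning methane, using the commonly quoted figure of roughly 890 kJ/mol as an illustrative value; running it yields numbers, not flames.

```python
# A computer can reproduce the information-processing side of combustion
# (stoichiometry and energy bookkeeping) without anything burning.
# The enthalpy figure is the commonly quoted value for methane, used here
# purely as an illustration.

ENTHALPY_OF_COMBUSTION_CH4 = 890.0   # kJ released per mol of CH4 (approximate)
MOLAR_MASS_CH4 = 16.04               # g/mol

def simulated_heat_release(grams_of_methane):
    """Heat (kJ) that burning this much methane would release."""
    moles = grams_of_methane / MOLAR_MASS_CH4
    return moles * ENTHALPY_OF_COMBUSTION_CH4

print(simulated_heat_release(100.0))  # about 5549 kJ, yet nothing gets hot
```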
Because the nature of your attempts at rebuttal indicates that, deliberately or inadvertently, you're misunderstanding what it is I'm actually arguing.
Would the sun consider the stove hot?
Because no matter how detailed, no description of the processes involved in producing pain gives any idea of what it feels like to have a pain.
RD, I'm fairly certain that you're a lot sharper than that; you're deliberately being obtuse. You know good & well that I'm not arguing that it can't ever be done.
The fact of the matter is that it can't be done with the level of understanding we have at our disposal now, and it's most certainly not possible with the descriptive level in your last response. There's a term for a systematic failure to recognize one's limitations: it's called arrogance.
Again, I don't think this is called for.
How do you know that describing the behaviour of the system when pain is felt doesn't describe pain?
Of course not. But how does that help you? No one is claiming that.
What I'm saying is that "pain" may be nothing other than the behaviour behind pain itself, that is, the whole process of stimuli-to-reaction. The added layer of "qualia", or whatever, doesn't really explain anything.
You're reporting your "experience of pain" to be something entirely different, and so is UE. But to me the experience IS the pain, and so they are one and the same. As opposed to what UE seems to be thinking, I'm not being argumentative about it. It's my honest opinion.
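To make that "whole process of stimuli-to-reaction" reading concrete, here is a toy sketch of a nociceptive loop; every name and number in it is a hypothetical illustration, and it says nothing either way about whether anything it models is felt.

```python
# Toy "stimuli-to-reaction" loop: a stimulus above a threshold triggers a
# withdrawal reaction and raises an internal damage signal that decays over
# time. All names and values are hypothetical illustrations, not a model of
# real nociception.

def pain_process(stimuli, threshold=5.0, decay=0.5):
    """Map a sequence of stimulus intensities to (reaction, damage_signal) pairs."""
    damage_signal = 0.0
    trace = []
    for intensity in stimuli:
        damage_signal *= decay                  # earlier signal fades
        if intensity > threshold:               # noxious stimulus detected
            damage_signal += intensity - threshold
            reaction = "withdraw"
        else:
            reaction = "continue"
        trace.append((reaction, round(damage_signal, 2)))
    return trace

print(pain_process([1.0, 8.0, 2.0, 9.5]))
# -> [('continue', 0.0), ('withdraw', 3.0), ('continue', 1.5), ('withdraw', 5.25)]
```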
Don't you see that any description of the behaviour is entirely different to a description of the experience - and that it is pretty much impossible to describe the experience of pain except with the assumption that the person to whom you're describing it has had some similar experience?
And how is this different from most behaviours humans exhibit, such as, say, running? Or are you saying there really is a "hard problem of running", and of every other behaviour we engage in?