yy2bggggs
Master Poster
- Joined: Oct 22, 2007
- Messages: 2,435
"The other is to assume that one can produce consciousness in some way without knowing exactly what it is or how it works."
Like what mommy and daddy did?
You're killing your own argument. What you are calling a computer isn't all that a computer is.
And my point remains. We are computers. Everything you say about computers, if you just say "people" instead, is equally applicable. You're trying to show a difference by appealing to your own prejudices, which we are in no way obliged to share.
"Yes, we can act as computers - but if we act purely as computers we don't thereby get understanding."
Let's go through this using a different approach.
"...does rolling a figurative die count?"
Depends, does the die rely solely on a mathematical algorithm? If so then no.
"At what point do you think I claimed what you're rebutting?"
I could be wrong, having misunderstood your point. I apologize.
"If we observe a property in exactly one place, then that's what we should assume possesses the property."
I'm not sure what this has to do with the point at hand.
"Why do you assume that just because consciousness has not yet appeared except in humans that it can't be produced?"
??? That's what I said.
"The other is to assume that one can produce consciousness in some way without knowing exactly what it is or how it works."
Based on what theory? It's nonsense, and not a premise held by scientists and experts in the field. You are entirely without foundation.
"Depends, does the die rely solely on a mathematical algorithm? If so then no."
The algorithm isn't in the internal workings of the die. The algorithm is to toss the die, and map its results onto the desired range.
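For what it's worth, here is a minimal Python sketch of the kind of mapping being described. The toss_die function is a hypothetical stand-in for the physical toss; the point is that the randomness lives in the die, while the algorithm is only the procedure that maps each toss onto the desired range.

```python
import random

def toss_die():
    """Hypothetical stand-in for a physical six-sided die toss."""
    return random.randint(1, 6)

def roll_in_range(n):
    """Map die tosses onto the range 1..n by rejection.

    The algorithm is not inside the die; it is this mapping:
    accept a toss that lands in range, re-toss anything that doesn't,
    so every value in 1..n stays equally likely.
    """
    if not 1 <= n <= 6:
        raise ValueError("this sketch only handles ranges up to 6")
    while True:
        result = toss_die()
        if result <= n:
            return result

# Example: a fair 1..4 result built from an ordinary six-sided die.
print(roll_in_range(4))
```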
We can be computers - but if that's all we are, we don't understand anything either.
"yy2bggggs, you're not arguing that: ..."
{sorry, it's a minor point}
No. The human brain is a computer. That's just what it is--it is a device that calculates. As a consequence of this, when westprog says that computers can't x, it's trivially false if x is something that our brain does. That's my first argument.
Furthermore, westprog is accusing his opposition of overreaching in their metaphysics, and maybe they are. But he's not being very convincing at showing how not to overreach in your metaphysics when he asserts that computers can't do x. As such, he's killing his own argument.
His particular argument is that people are more than computers. My particular counter is that computers are more than what he calls computers. In order to show that my computer--which is also a transmitter, receiver, lamp, desk, electronic device, heater, etc.--does not do what a brain--which is also a living organ, an electronic device, a heater, etc.--does, westprog needs to show that the living brain has something that my computer does not. Since westprog doesn't even know what the human brain has that allows it to do x, how can he claim that my computer can't do x?
Those are the two points I'm arguing. Note that I'm not specifically arguing that a human brain is a computer (except for a brief touch here)... I'd like for westprog to outright deny this before I make this point.
"Yes, we can act as computers - but if we act purely as computers we don't thereby get understanding."
Sure, fine. So computers don't understand anything, and neither do we.
"Nothing in real life is a pure computer, of course. It's an abstract concept."
Physical instantiations are not ideal computers (the word "pure" is meaningless here). They have finite limits and are subject to error.
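To make the "finite limits" point concrete, here is a small Python illustration of my own (not something from the posts above): a physical machine's floating-point arithmetic is bounded and approximate, whereas exact rational arithmetic comes closer to the idealized, abstract notion of a computer.

```python
from fractions import Fraction

# A physical machine approximates real numbers with a fixed number of bits,
# so even a trivial sum picks up representation error.
a = 0.1 + 0.2
print(a)            # 0.30000000000000004, not exactly 0.3
print(a == 0.3)     # False: the representation is finite

# Exact rational arithmetic behaves like the idealized abstraction.
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```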
A "private behavior" would just refer to a cognitive process that is not outwardly communicated. An "experience" would refer to the actual sensations associated with those cognitive processes.
So "qualia" are "qualitative experience" are "subjective experience" are "qualia".
How do we know they exist, again?
The same way that we know anything else exists; we experience it in some capacity.
Again, the terms are just labels we put on the actual experiences. Scientifically understanding the experience is what we should be aiming for.
Strawman.
Excuse me.... WHO said ANYTHING about "things we cannot detect"? This is a big problem in this forum, as I have always stated.
So which is it that distinguishes our computers from our brains? Are they insufficiently complex, or merely insufficiently error-prone?
My model of the mind is that certain cognitive processes cause certain behaviors. I say nothing about experience itself, because assertions of that nature would contribute nothing to what we can observe experimentally; I consider the nature of experience to be unknowable.
Well, that's the HPC (the hard problem of consciousness) for you.
Yes, we can act as computers - but if we act purely as computers we don't thereby get understanding.
"That's the point we're making. An 'experience' IS a private behaviour."
Not really. An experience refers to the notion of a sensation such as the "redness" of red or the "blueness" of blue. A private behavior would merely be a cognitive process that has the potential of triggering a public behavior, like someone saying, "I see the color red." The presence of an experience isn't required to explain the latter, and since we can only know of the latter phenomenon scientifically, we run into a quagmire where there is no accounting for the phenomenon of experience. Even the idea that sensations exist in the first place can be understood in terms of cognitive processes, which opens the door to the possibility that sensations are a nonsense concept invented by the brain to help it organize itself. If that were true, it would render consciousness an illusion altogether. So you can't just decide that experience and cognition are one and the same because you feel like it and because it makes the epistemological difficulties involved easier to think about. It wouldn't make sense to equate something real with something imaginary.
"And I take it you're arguing in principle: brain => computer. (Right now, brains do many things computers don't.)"
No. Brains are computers, so if brains do things, computers do them (e.g., brains do them). There are some things that brains do that silicon-based IBM PC compatibles do not do--for example, metabolize glucose. But there's nothing that brains can do that computers can't do, because brains are computers.
"And since the brain is a massively parallel architecture of neurons, and neurons work in principle like logic switches in a computer,"
Not just like logic switches in a computer. Neurons are logical switches.
"anything a brain can do a computer can too, in principle."
...not exactly. Anything a brain can do a computer can too, in practice, because brains are computers.
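As a side note, the "neurons are logic switches" idea can be sketched in a few lines. This is only a toy McCulloch-Pitts-style threshold unit of my own, not a model of how biological neurons actually behave: the same thresholding element, given different weights and thresholds, acts as different logic gates.

```python
def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Wired one way the unit behaves as an AND gate, wired another way as an OR gate.
def AND(a, b):
    return neuron([a, b], [1, 1], threshold=2)

def OR(a, b):
    return neuron([a, b], [1, 1], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
```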