A priori synthetic statements

David said:
None of those are descriptions of blueness. Mary in the black and white room cannot be given any of your descriptions and know what blueness is like.
But, in principle, she can be operated upon and then know what blueness is like. Her lack of knowing blueness is nothing more than a lack of certain physical memories in her brain.

Conversely, Mary can be given a description of a physical process, and know everything there is to know about that description.
But she still won't have experienced the physical process. So if the process is, say, jumping over a stool, and she has never jumped over a stool, she won't have the entire set of associated memories. If the process is something like making gasoline from crude oil, where there is no psychological aspect to the process, then she cannot know what it is like to be gasoline made from crude oil one way or the other.

Even though we use the same word (knowledge) for each scenario, the former scenario is different to the latter for the simple reason that blueness cannot be described to Mary.
It is different, yes, but only because some experiences involve a psychological aspect and some do not. If you want to think that somehow makes consciousness fundamentally different, be my guest.

If we cannot describe to Mary what blueness is in the same way that we can describe every conceivable physical process, then blueness cannot be physical.
Of course it can. The problem is that describing the process of seeing blue is not the same thing as seeing blue. The latter forms some memories in the brain that cannot be formed merely by reading a description.

If blueness cannot be described to Mary in terms of physical relationships then how can you say it affects the brain?
Blueness can be described in terms of physical relationships, but reading that description does not form all of the relationships so described.

In order for something to affect the brain it must be initially defined in a physical way.
I'm not sure what you mean.

The problem with the Mary thought experiment is that it ignores the fact that not all memories are formed by intellectualizing descriptions of processes.

~~ Paul
 
Paul C. Anagnostopoulos said:

The problem with the Mary thought experiment is that it ignores the fact that not all memories are formed by intellectualizing descriptions of processes.
~~ Paul
And the problem with the materialist worldview is that it sees no distinction between memories imprinted by intellectualizing vs. those imprinted directly by sensory i/o.

What intellectualizes? Er, yes, I know -- an epiphenomenon. ;)
 
Hammegk said:
And the problem with the materialist worldview is that it sees no distinction between memories imprinted by intellectualizing vs. those imprinted directly by sensory i/o.
You mean the distinction can enlighten us about why the dreaded materialism is wrong? How so?

~~ Paul

Edited to add: Pfffft! Silly me. Of course, intellectualizing cannot be a product of brain function.

~~ Paul
 
Paul C. Anagnostopoulos said:
Pfffft! Silly me. Of course, intellectualizing cannot be a product of brain function.
Sure it is, for you. :D

Now leap the causally efficacious gap -- or admit you are a Turing machine hooked up to some i/o sensors/servos. And I admit that perhaps you are, but *I* will never "know".
 
Hammegk,

And the problem with the materialist worldview is that it sees no distinction between memories imprinted by intellectualizing vs. those imprinted directly by sensory i/o.
This is simply false. Nothing about any form of materialism suggests that there is no distinction, and neuroscience most definitely recognizes that there is a distinction. There are many different types of memories, and many different ways in which they form.

What intellectualizes? Er, yes, I know -- an epiphenomenon.
No, the brain intellectualizes. Even epiphenomenalists recognize that, which is why epiphenomenalism falls apart logically. Since the brain is what thinks, remembers, and ultimately knows things, the notion that we actually know we possess supposedly causally inefficacious properties is simply incoherent.

Now leap the causally efficacious gap -- or admit you are a Turing machine hooked up to some i/o sensors/servos. And I admit that perhaps you are, but *I* will never "know".
The idea that you may be a biological machine scares the crap out of you, doesn't it?

I suppose it makes you feel better to pretend that, since you can never know with absolute certainty that you are, you somehow have good reason to believe that you are not. The fact is that the evidence pretty clearly indicates that you are. But I suspect that as long as you can dream up metaphysical scenarios by which you would not be, you will continue to cling to the belief that you are not.


Dr. Stupid
 
Stimpson J. Cat said:

The idea that you may be a biological machine scares the crap out of you, doesn't it?
Should I find a logical position that suggested such to be the case, why would I not accept it? Yet I do not find that facts and logic dictate that we are bio-machines.

BTW, do you accept that *you* are indeed such a machine?
 
Hammegk,

BTW, do you accept that *you* are indeed such a machine?
I don't know what you mean by "*you*", or how it differs from just "you", but I certainly accept that I am a biological machine.

Dr. Stupid
 
Yup, we agree you and I are bio-machines. *You* and *I* are the reason *we* cannot be replaced by Turing machines and some i/o devices.
 
Yup, we agree you and I are bio-machines. *You* and *I* are the reason *we* cannot be replaced by Turing machines and some i/o devices.
I don't suppose you would care to explain what these terms "*you*" and "*I*" are referring to? Or why you do not think that they can be modeled as Turing machines.

By the way, not to be over-picky, but Turing machines are just models. Any truly deterministic machine with a finite number of degrees of freedom could be considered a Turing machine, but such things do not exist in nature. Plus, it is unnecessary to say "Turing machines and some i/o devices", since i/o devices can also be modeled as Turing machines.
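To make the modeling point concrete, here is a minimal sketch of what "modeled as a Turing machine" means in practice: a deterministic machine's behavior written down as a transition table and stepped through by a simulator. This is purely illustrative; the function name, states, and the binary-increment example are made up for the sketch, not anything from the thread.

```python
# A minimal Turing machine simulator (illustrative sketch).
# transitions: (state, symbol) -> (next_state, symbol_to_write, head_move)
# head_move is -1 (left), +1 (right), or 0 (stay).

def run_turing_machine(transitions, tape, state, head, halt_states,
                       blank="_", max_steps=10_000):
    """Step the machine until it halts (or gives up); return the final tape."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state in halt_states:
            break
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells)

# Example table: increment a binary number by 1.
# The head starts on the least-significant (rightmost) bit.
increment = {
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, carry moves left
    ("carry", "0"): ("halt",  "1",  0),  # 0 + carry = 1, done
    ("carry", "_"): ("halt",  "1",  0),  # off the left edge: new leading 1
}

print(run_turing_machine(increment, "1011", "carry", 3, {"halt"}))  # -> 1100
```

Nothing hinges on the particular table; the point under dispute in the thread is only whether a deterministic rule-following process captured this way exhausts what *you* are.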


Dr. Stupid
 
Picky, indeed, but I stated "replaced by", not "modelled by"; hence the need for i/o. I agree that "Any truly deterministic machine with a finite number of degrees of freedom" is what I actually have in mind.

Mr. Data can model human behavior; you say that makes it "alive" and "human", capable (theoretically) of replacing *you*. That's where we disagree.
 
Hammegk,

Picky, indeed, but I stated "replaced by", not "modelled by"; hence the need for i/o. I agree that "Any truly deterministic machine with a finite number of degrees of freedom" is what I actually have in mind.
Well, biological machines are not Turing machines, so I don't see your point.

Mr. Data can model human behavior; you say that makes it "alive" and "human", capable (theoretically) of replacing *you*. That's where we disagree.
Mr. Data is not "alive", nor is he "human", nor does he even do a very good job of modeling human behavior. But this is completely beside the point since Mr. Data is also not a biological machine.

Anyway, I don't see what any of this has to do with consciousness. And you still have not explained what you mean by "*you*", or how it differs from "you".


Dr. Stupid
 
