The relationship between science and materialism

Paul,

Another way of explaining it.

Language is naturally split into subjective things and objective things, into mental and physical.
What do you mean by "naturally" split? Allow me to argue that this split is completely unnatural.
If we allow this dualistic language when defining our terms for the purposes of these arguments then, as we have seen, this leads to logical problems.
Agreed.
There are two strategies currently being employed to get out of the logical problem. The first strategy is eliminativism, which effectively chops off the subjective half of language, and all the terms which naturally belong in that half. This works logically, but involves the outright denial of the existence of mind and claims that half our language is surplus to requirements.
John B. Watson attempted this, and it fails. If you are attempting to call modern behaviorists "eliminativists", though, then you are misusing the word. Note that eliminativists are tacitly speaking of a dualistic world, part of which they ignore. This is folly, of course.
The second strategy is to tell a fancy story about how "minds ARE brain processes" and hope nobody notices that the "ARE" doesn't actually mean anything at all.
"The" second strategy? "A" second strategy, perhaps, and perhaps not even that. There are other options.
So the problem with the second strategy is that we have an unexplained and apparently inexplicable "IS". Now - think back to my system. What's in it? We don't have any mind or matter.
But you are still phrasing it in the terms used by dualists. You are just bending over backwards to deny that you are doing so.
But what we do have is something called "Being". "Being" is another form of "is" - they are both parts of the verb "to be". So in actual fact the "extra thing" in my system that you might say isn't needed is another manifestation of the meaningless "IS" which is invoked by the non-eliminative materialists. But there is a difference. Their "is" doesn't mean anything and mine does.
The very assertion that their IS is meaningless implies that you are still seeing some dualistic difference between brain stuff and mind stuff. Otherwise, why do you find their IS problematic, while your IS solves everything? (for the record, my own view is that the "minds ARE brain processes" statement is simply a poor phrasing of the problem, and when problems are phrased poorly, we get poor answers. But if tied to a railroad track and forced to agree or disagree with it, I would easily disagree with it. Congratulations, someone who has been labeled an eliminativist has now disagreed with both of your strategies.)
Theirs is in the wrong place and mine is in the right place. All I am doing is re-arranging the components of the system so things have the correct relationships with each other instead of the incorrect ones.
Re-arranging the deck chairs on the Titanic is not gonna stop it from going down.
 
**** Computers do not have minds ****

Can you really show that what brains do isn't a data manipulation of a form that is totally analogous to what a computer does?

Can you answer this without an argument from ignorance? (That is to say, if you tell us again that we understand how a computer computes but not a human brain, I will have to virtually slap you.)
 
**** Computers do not have minds ****
Fine. Let's accept that. But computers do embody computation. Computation is not a physical thing. Whilst it requires a physical computer of some sort to run on, computation does not in itself have physical properties. Not only that but a wide variety of physical things can implement computation and any one of them can, for example, calculate prime numbers, play chess, simulate aircraft in flight and a whole range of things that we wouldn't imagine just from looking at their physical structure. Just as we wouldn't imagine a physical brain could be conscious. Computation is a "thing" in its own right, but not a physical thing.
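
To make that point concrete, here is a minimal sketch (in Python; the function name is_prime and the sample values are purely illustrative choices of mine, not anything from the discussion). The computation is specified entirely by the abstract steps below, with no mention of silicon, relays, or neurons; any physical system that carries out those steps is performing the same computation.

    def is_prime(n):
        # The computation is defined by these steps alone,
        # not by whatever physical machinery happens to carry them out.
        if n < 2:
            return False
        for d in range(2, int(n ** 0.5) + 1):
            if n % d == 0:
                return False
        return True

    # The same abstract procedure yields the same result on any substrate:
    print([n for n in range(2, 30) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]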

Now we could go further and say that consciousness is just computation, and some do. But we don't have to go that far. Let's just say that like computation, consciousness is a thing in its own right, but not a physical thing, even though it is entirely dependent on the physical world for its existence. Computation is the action of a computer when it is operating. But computation is not the same thing as the computer that implements it. Likewise consciousness is brain processes but is not the same as the physical brain. The two "is"'s in italics above are equivalent.

We don't feel the need to have a separate entry in our ontology for computing, given that it is so obviously dependent on the physical world for its existence. We don't worry that this is computational eliminativism.
 
I'm happy to say you're describing a monism as long as this Being thing doesn't actually serve any function.

Good. That is precisely why I have limited what I have said about the properties of Being. You can quite easily turn my position into that of either Sartre or of Heidegger. They both have extensive theories about "whether this Being thing serves any function". Ultimately they differ along the general lines of theism/atheism or naturalism/non-naturalism. So right now we are in a neutral position regarding the nature of Being with respect to "what it does". It could be defined as a merely passive part of the system, ending up with a position which was essentially naturalistic and atheistic (in a Nietzsche/Sartre sort of way), or it could be defined as an active part of the system, in which case you end up with a potentially theistic and supernaturalistic (in the philosophical sense of "supernatural") Being. This system does not specify which way we go on this. It allows this decision to be left unmade or to be made by other methods, potentially including science, other branches of philosophy, personal experience and religion. So, from your point of view, you can point to the non-winning of Randi's prize and infer that Being "doesn't serve any [active] function". In this way it truly is science that is being employed in making the critical decision, not an unacknowledged metaphysical allegiance, because this metaphysical system, unlike materialism, is not explicitly naturalistic. Neither is it explicitly supernaturalistic.
 
But you are still phrasing it in the terms used by dualists. You are just bending over backwards to deny that you are doing so.

That's because there aren't any other words for me to use, Merc.

The very assertion that their IS is meaningless implies that you are still seeing some dualistic difference between brain stuff and mind stuff. Otherwise, why do you find their IS problematic, while your IS solves everything?

My "IS" doesn't serve the same function. There is no dualism in my system. The dualism exists in the world of experience and the world of language. So I can still use that dualism when I am using language to describe our everyday experience of the world. What REALLY exists is the "noumenal" neutral entity - not mind or matter.

Their "IS" is problematic because of the way they have defined "matter" to be the "external world". It is THIS which sets up the dualism, not anything that I have done. But my position is immune to this problem for the following reason:

I haven't got the materialist's concept of matter.

So not only is there no "mind-stuff", there isn't any "material stuff" either. No dualism. No implied dualism. In other words the problem isn't ME unwittingly being dualistic - it is the materialists themselves. That is why Dennett calls some of them "Cartesian Materialists".
 
Are you disagreeing that there may be something it is like to be a computer?

~~ Paul
Which reminds me of something I found the other day, that I thought was amusing....

There is something it is like.

This innocent-looking little phrase, which I believe comes from Nagel's famous paper 'What is it like to be a bat?' (summary of the paper - 'how would I know?') has played a quite astonishing role in raising the morale of the lovers of qualia. Whenever they're on the brink of throwing in the towel and admitting that it's been confused nonsense all along, they repeat the mantra and everyone brightens up again. The idea is that when you see something red, it isn't just a matter of acquiring some information about the light hitting your eye: there is something it is like to see the colour red.

To me, this is about as sensible as trying to include carnal knowledge in epistemology, or debating the ontology of the 'it' that does the raining. (Come to think of it, some idle philosopher has probably done that last one). When we talk about a thing being like something, that's what we mean - it's like something else. If I eat ostrich, and someone asks me what it's like, I don't screw up my face and say 'Uh, well I can't tell you, but there was an ineffable experience which it was like'. I say 'A bit like beef, with a slightly less granular texture.'

So, if there is something it is like to see red, what is it? Seeing puce?
http://www.consciousentities.com/
 
Are you disagreeing that there may be something it is like to be a computer?

~~ Paul

It amazes me how incongruent your arguments are.

Asking this question just shows that you have no idea about what you're really defending. Why would a physical object have a nonphysical experience?
 
Yes, I love the phrase "something it is like," too. Dan Dennett's a real fan of it. :D But I don't know how to ask the question any other way. Then again, maybe the question makes no sense.

Do you think a computer has any sort of subjective experience vaguely like our own?

~~ Paul
 
Mary said:
It amazes me how incongruent your arguments are.

Asking this question just shows that you have no idea about what you're really defending. Why would a physical object have a nonphysical experience?
Who said anything about it being nonphysical? Are we required to load "something it is like" with dualistic notions, too?

And, by the way, what is it I'm defending?

~~ Paul
 
You know, it's interesting. Apparently certain terms, when used in a philosophical conversation, are assumed to be heavily laden with all sorts of notions, mainly dualistic ones. But if you're not a professional philosopher, you don't know this. So people interpret what you say in various ways that have nothing to do with what you said. This can carry on for quite some time before you realize that you've been completely misunderstood. It's rather annoying, really.

~~ Paul
 
Fine. Let's accept that. But computers do embody computation. Computation is not a physical thing.

No. It is a process taking place in a physical thing. That process has entirely physically-describable properties. It has no mental properties. It does not resemble a mind in any way, shape or form. What it DOES resemble are physical processes taking place in a brain. It resembles them because both brains and computers are physical objects which, in some way, process information.

Whilst it requires a physical computer of some sort to run on, computation does not in itself have physical properties.

Yes it does. There are no properties of the process going on in a computer that cannot be described physically. None. What sort of properties do you think it has if they are not physical? Mental properties? No. We've already agreed that computers don't have minds, so they aren't mental properties. All of its properties are properties of a physical process. Period.

Not only that but a wide variety of physical things can implement computation and any one of them can, for example, calculate prime numbers, play chess, simulate aircraft in flight and a whole range of things that we wouldn't imagine just from looking at their physical structure.

We could imagine it if we looked at the structure of the physical process.

Just as we wouldn't imagine a physical brain could be conscious. Computation is a "thing" in its own right, but not a physical thing.

I just explained why that's not quite right.

Now we could go further and say that consciousness is just computation and some do. But we don't have to go that far.

Best not to. That's where all the problems start.

Let's just say that like computation, consciousness is a thing in its own right, but not a physical thing, even though it is entirely dependent on the physical world for its existence.

(emphasis mine)

We would have to be very careful about what we meant and understood by this. You see, consciousness, unlike computation, has all sorts of properties that aren't properties of a physical process. They are properties of a mental process. Suddenly, all the words we need to describe the properties of consciousness are subjective, mental, 1st-person...you know. So, sure, we can say consciousness is a thing in its own right, and not a physical thing, but then we run into problems. The problems are with the bolded words. I could agree to the following:

"Let's just say that like computation, consciousness is a thing in it's own right, but not a physical thing, even though it is dependent on the physical world for its content."

Now we have an accurate comparison between consciousness and computation. What we DO know about minds is that their content is tightly bound to the structure of physical brain processes. So, as scientists, rather than people who just assert that physicalism is true, we should only say what I said, and not what you said. Your claim goes over and above what is legitimised by science. It depends on an assumption that materialism is true. I think we may have been here before......

Likewise consciousness is brain processes but is not the same as the physical brain.

...and there's the meaningless "IS". :)

Chris, I am a software engineer and a philosophy student. I really do understand computationalism. I also understand exactly what is wrong with it. You are mixing up two things.

1) The relationship between brain and brain process.
2) The relationship between brain process and mind.

All of your detailed description in this post explained (1). It didn't explain anything else. It painstakingly pointed out that brain is not brain process. All well and good. Except that you then tried to claim that this was also an explanation of (2)! Look where you ended up:

Likewise consciousness is brain processes but is not the same as the physical brain.

Which relationship are you explaining here? You are quite explicitly explaining (1). However, you seem to believe that what you explained was (2). You set out to explain (2), but all you explained was (1). You then tried to fill in the missing piece with a meaningless "IS".

Do you understand computationalism now? It may help to re-read this post and try to figure out how you ended up explaining (1) when you thought you'd explained (2).

Geoff
 
Geoff said:
No. It is a process taking place in a physical thing. That process has entirely physically-describable properties. It has no mental properties. It does not resemble a mind in any way, shape or form. What it DOES resemble are physical processes taking place in a brain. It resembles them because both brains and computers are physical objects which, in some way, process information.

Yes it does. There are no properties of the process going on in a computer that cannot be described physically. None. What sort of properties do you think it has if they are not physical? Mental properties? No. We've already agreed that computers don't have minds, so they aren't mental properties. All of its properties are properties of a physical process. Period.
Geoff, I have no idea what you're saying here. It sounds so heavily dualistic that I don't know where to start. You really need to move along with your description of neutral monism.

What does it mean for something to have a mind? How is it that people do but computers don't? How can we tell? What are these mental properties?

~~ Paul
 
Geoff, I have no idea what you're saying here. It sounds so heavily dualistic....

Let the dualism wash over you Paul. Bathe in it. Allow yourself the luxury of freeing yourself from being scared of using dualistic vocabulary. All we are doing is using normal words, and we know what they mean. Stop resisting the dualistic vocabulary and you will find it very easy to know what I am saying, because it makes absolute, crystal clear, perfect sense, so that anyone not embroiled in this debate would have no problem whatsoever with it.

that I don't know where to start. You really need to move along with your description of neutral monism.

You haven't responded to my post describing whether or not "Being has a function". Did you understand it?
 
Who said anything about it being nonphysical? Are we required to load "something it is like" with dualistic notions, too?

It is YOU who insists on using dualistic language. That's why you are not consistent. You can either have your cake or eat it, not both.

In a physical world, where only physical things and processes exist, the experience "what is it like to be a computer" is nonsense.

And, by the way, what is it I'm defending?

Nonsense
 
In a physical world, where only physical things and processes exist, the experience "what is it like to be a computer" is nonsense.

Right, so from there you have only two options:

1) We don't have an experience of what it is like to be.
2) There's something nonphysical.
3) Ruling it out as nonsense is an arrogant position.

Woah! Wait, a third snuck in there.

It's inherently arrogant to argue from that position. I might as well take hammy's route and argue that it's nonsense to argue philosophy with a bunch of p-zombies. You might not like it, but hey, I'm the only one with a true and mysteriously produced consciousness, so what do I care?

How about another option:

4) Stop assuming our perspectives are particularly special.
 
In a physical world, where only physical things and processes exist, the experience "what is it like to be a computer" is nonsense.

Yes. Correct for a computer, but not a human as I tried to point out. The difference arises from what we mean by the word "experience" which is commonly steeped in emotion and feeling. Current computers do not have the ability to feel emotion, so they do not experience as we do.

We do have several different feeling systems and emotion systems. In fact, they provide problems for the whole idea of an integrated consciousness because we may stimulate various areas of the brain to produce a raw emotion that the person feels is somehow not "his" or "hers" -- it doesn't quite fit with what is going on around or in them. The same thing happens occasionally with the auras of temporal lobe epilepsy, where the brain produces an emotion or a feeling that seems alien to the person experiencing it. Most of the time we seem to integrate this sort of thing into the ongoing story of our lives, but occasionally there is such a disjunction between the "intrusive" emotion or feeling and what is transpiring in the person's brain that no such easy melding is possible, so the experience is considered alien.
 
What does it mean for something to have a mind?

Well, it's got something to do with brain processes and something to do with Being, but the relationship is not "minds are brain processes". We can't define minds like that because we haven't got a definition of matter yet, and brains are material.

I need to resort to dualistic vocabulary in order to explain to you what it means for something to be "physical" and "mental" in my system. That means I have to switch perspective, and I don't want this to cause confusion. I've already said that what really exists is neutral and not physical. "Physical" and "mental" both belong to our world of experience - the world we use dualistic language to describe. This is going to sound like idealism to you - but it isn't. Remember I've already stated that what "really exists" is neither mental nor physical. But "mental" and "physical" are not part of the noumenal/neutral world. They are part of our world.


**********************************************
IMPORTANT: The descriptions which follow describe what mind and matter are with respect to each other, and from the perspective of the lifeworld. Only after these descriptions are complete is it possible to explain how to reduce "mind" and "matter" to a neutral entity. In other words, before you can reduce them, you have to properly explain what is to be reduced.
**********************************************

I'll start with "physical". Since "physical" now definately does not refer to "the mind-external reality", we are free to use it in the natural, dualistic-influenced way that we use it all the time. "Physical" is what the objects of our experience are. When you see a chair - that is a physical chair. But we aren't mixing up the physical chair you experience with any sort of "external physical chair" - because those external things are neutral, not physical. So my definition of physical now accords with the definition of physical given by idealism. It's different to the one given by dualism. And it's the same as one of the two different uses of "physical" employed by materialists. It's just no longer also the other of those two uses. In other words we have avoided the whole P1/P2 problem from before. However, we are still lacking the definition of "mind".

Mind is easy, though. Just as with physical, the definition of mind is the obvious one. It is the entire subjective frame of reference. It is the "frame" in which all of our experiences occur. Everything we ever directly know comes to us via it. This, again, sounds like idealism, but it differs in that just because consciousness presents itself to us as the frame for all our experiences, I am not concluding from this that everything which exists is mental. It just seems that way if you think about it in the right way. Just like it seems that everything is physical if you think about it in another kind of way.

I'll stop there for the moment. The next bit of the theory will explain how to map the mind and the matter onto Being and a neutral entity in such a way as to reduce both of them to the neutral system without eliminating either of them.

How is it that people do but computers don't?

I'll defer this question until I have finished the system.

What are these mental properties?

Let the dualist language in, Paul.....

I think you know perfectly well what mental properties are. So does everybody else.
 
Hey, look at the Wikipedia definition of eliminative materialism:

I'm perfectly happy to be called a moderate eliminativist. I don't see anything about denying subjective experience in there at all.
What does the "moderate" qualification bring to the table? Does that allow you sneak in a bit of cartesian dualism -- just to cover all bases?

Eliminative Materialists can be replaced exactly by a sufficiently powerful digital computer and some sophisticated sensors & servos for I/O.

I feel so much better now.
~~ Paul
Why?
 
