
“Materialism” is an empty word. An incendiary article.

Well, you know your own subjective experiences exist. But with regards to others, no. The existence of anyone else's subjective experiences can only be inferred, and the content of those experiences will always be a mystery. There is no "qualia detector". There is no way for me to know what your experience of the color red is like, and vice versa. For all we know, our subjective experiences of colors vary from person to person.



If your brain is similar to mine, I can infer you experience things roughly like I do (even if the content of your experiences is unknowable to anyone other than you), but I can't be sure. If you make a mechanical brain that is functionally identical to a human brain, it should experience things, but for all we know it might be a zombie. There's no way to test for the existence of experience.

hi,
I am just stating my belief that the word 'perceptions' exists and has a more coherent meaning than 'qualia'.
:)

I was more confrontational at that time than I wish I had been.

You test through interviews, and I am an M-zombie.
 
Obviously, this thread has ranged quite widely, with interesting digressions here and there (aside from dull intrusions involving half-considered allegations about mere "semantic word games"). Thanks to David Mo for introducing us to this topic and taking the time to translate the gist of the argument.
 
My car can go from 0 to 60 MPH in 20 seconds, so obviously at the end of one minute I'll be going 180 MPH and at the end of ten minutes 1,800 MPH.

I'm sorry, but are you suggesting that it is inconceivable that a computer could model the position and energy of each molecule in a snowball?

It is, of course, a merely hypothetical computer at this point, but so what?
 
To that argument I ask, what does it feel like to be a computer? Do they experience 'qualia' like us? Of course they do, just like humans do when specific parts of the brain are stimulated with electrical impulses. We only think we are different due to how our conscious mind interprets these 'feelings'.

Are you for real? I'm just interested, not making judgments.

Are you saying that I'm hurting my laptop whilst typing this?
 
How do you know computers can experience things? Can calculators? Light switches? Abacuses?

It seems to be all the rage these days to claim that the degree of consciousness a system possesses relates not to its processing power, but to the amount of information the system generates as a whole that cannot be ascribed to its component subsystems.

Thus I don't think calculators, light switches and abacuses would do too well. Computers?
 
How do you know computers can experience things? Can calculators? Light switches? Abacuses?
How do you know they don't?

Experience
is the knowledge or mastery of an event or subject gained through involvement in or exposure to it. Terms in philosophy, such as "empirical knowledge" or "a posteriori knowledge," are used to refer to knowledge based on experience.
Computers that gather information and learn from it definitely 'experience' things. A light switch might not, but even the simplest devices are gaining ever more computing power, so who knows what a future light switch might 'think'?

tsig said:
My car can go from 0 to 60 MPH in 20 seconds, so obviously at the end of one minute I'll be going 180 MPH and at the end of ten minutes 1,800 MPH.
A poor analogy. The performance limitations of a particular motor car do not relate to advances in computer technology.

And you obviously aren't familiar with Moore's Law.
Computer industry technology road maps predict (as of 2001) that Moore's law will continue for several generations of semiconductor chips. Depending on the doubling time used in the calculations, this could mean up to a hundredfold increase in transistor count per chip within a decade...
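As a sanity check on that "hundredfold" figure: assuming the commonly quoted 18-month doubling time (a number not stated in the excerpt above), exponential doubling does give roughly a hundredfold increase in a decade. A quick sketch:

```python
# Growth factor in transistor count after a given number of years,
# assuming a fixed doubling time (18 months is a commonly cited figure).
def moore_growth(years, doubling_time_years=1.5):
    return 2 ** (years / doubling_time_years)

print(round(moore_growth(10)))  # roughly a hundredfold in a decade
```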

Lawrence Krauss and Glenn D. Starkman announced an ultimate limit of approximately 600 years in their paper, based on a rigorous estimation of the total information-processing capacity of any system in the Universe.
A better analogy would be a vehicle which is able to accelerate at a constant 3 mph/s for 2.2 hours, reaching a top speed of about 24,000 mph.
 
Fudbucker said:
How do you know computers can experience things? Can calculators? Light switches? Abacuses?


How do you know they don't?

Experience

You can't validly ascribe experience to any material thing. So you can't say "the computer experiences." You can say "I experience," but you can't validly say "the brain experiences." The brain processes. You can say that experience may emerge from processing activity.

Just to be usefully pedantic.

Computers that gather information and learn from it definitely 'experience' things.

What's your basis for making that claim? Can you, for example, demonstrate that the whole computer creates more information than the sum of its component parts? Or do you have another way?

Just to be clear, I'm not criticising your statement. I'm just interested.
 
Are you for real? I'm just interested, not making judgments.

Are you saying that I'm hurting my laptop whilst typing this?
Humans feel pain. Animals feel pain. Even microscopic rotifers have a brain and nervous system. They may only have 230 neurons, but they can still feel.

Now imagine a computer designed to 'feel' damage being done to it. Actually you don't have to imagine it, because modern computers do have subsystems which are designed to respond to 'pain'. Compared to the human nervous system it may be very crude, but the principle is the same. It wouldn't be too hard to make a robot with 230 'neurons' and 5 light sensor 'eyes' etc. that responds to stimuli exactly like a rotifer. If a rotifer can feel pain, then so would this robot.
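To make the idea concrete, a stimulus-response 'reflex' of that sort needs nothing more than a threshold rule. A toy sketch (the function name, threshold, and readings are invented for illustration, not a claim about real rotifer neurobiology):

```python
# A toy withdrawal reflex: if any sensor reading crosses a threshold,
# the robot "reacts", the way a rotifer contracts away from a stimulus.
def withdraw_reflex(sensor_readings, threshold=0.8):
    return any(reading > threshold for reading in sensor_readings)

print(withdraw_reflex([0.1, 0.2, 0.9]))  # True: strong stimulus on one sensor
print(withdraw_reflex([0.1, 0.2, 0.3]))  # False: nothing above threshold
```

Whether such a mechanism 'feels' anything is, of course, exactly the point in dispute.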

But how could a computer - which is just a collection of wires and transistors - get the kind of thought patterns that make human brains 'conscious'? The answer is that we make them that way. We make computers to mimic our own brains. When a programmer writes code to do a certain task, they are embedding into it their own logical thought processes - and perhaps even a part of their own personality.
 
How do you know they don't?

Experience

Computers that gather information and learn from it definitely 'experience' things. A light switch might not, but even the simplest devices are gaining ever more computing power, so who knows what a future light switch might 'think'?

A poor analogy. The performance limitations of a particular motor car do not relate to advances in computer technology.

And you obviously aren't familiar with Moore's Law.

A better analogy would be a vehicle which is able to accelerate at a constant 3 mph/s for 2.2 hours, reaching a top speed of about 24,000 mph.

You're making a positive claim that they do. The burden of proof is on you. Maybe they do, maybe they don't. If you're asserting a position, it's fair to ask what your evidence is.
 
You can't validly ascribe experience to any material thing. So you can't say "the computer experiences." You can say "I experience," but you can't validly say "the brain experiences." The brain processes. You can say that experience may be emerging from processing activity.
But I can, because the 'I' doesn't have to be consciously aware of an experience for it to have occurred. But even if you define 'experience' as only those things you are consciously aware of, there is no theoretical reason why a computer couldn't have a 'Master Control Program' which monitors its own activities to become 'aware' of what it is experiencing.

What's your basis for making that claim? Can you, for example, demonstrate that the whole computer creates more information than the sum of its component parts? Or do you have another way?
How does a human create more information than the 'sum of its parts'? Or do we? The amount of information we make use of is vastly less than what we take in, with no guarantee that what's left will be accurately preserved.

My father recently died of Alzheimer's. Towards the end it was clear that he was unable to store and process new information. He became like a simple computer that behaves according to a fixed program in ROM, unable to learn from experience or modify its thought processes. You could have switched him out for a robot that was preprogrammed with a limited set of responses, and I would not have been able to tell the difference. Yet he was still experiencing things - even if he couldn't remember or make sense of them.
 
Humans feel pain. Animals feel pain. Even microscopic rotifers have a brain and nervous system. They may only have 230 neurons, but they can still feel.

Now imagine a computer designed to 'feel' damage being done to it. Actually you don't have to imagine it, because modern computers do have subsystems which are designed to respond to 'pain'. Compared to the human nervous system it may be very crude, but the principle is the same. It wouldn't be too hard to make a robot with 230 'neurons' and 5 light sensor 'eyes' etc. that responds to stimuli exactly like a rotifer. If a rotifer can feel pain, then so would this robot.

But how could a computer - which is just a collection of wires and transistors - get the kind of thought patterns that make human brains 'conscious'? The answer is that we make them that way. We make computers to mimic our own brains. When a programmer writes code to do a certain task, they are embedding into it their own logical thought processes - and perhaps even a part of their own personality.

Does a thermostat "feel" hot or cold?
 
You're making a positive claim that they do. The burden of proof is on you. Maybe they do, maybe they don't. If you're asserting a position, it's fair to ask what your evidence is.
There is no doubt that computers gather information, store it, process it, and learn from it. That is what they do. The only question is whether this can be called 'experience', or whether that word applies only to humans.

So the 'proof' rests on whether we think that humans are somehow fundamentally different from every other entity in the universe - that somehow we are more than an arrangement of cells which are themselves a collection of atoms etc. acting solely according to the Laws of Physics. To insist that only humans can experience the world around them is the positive claim.

If you were to trace in precise detail the workings of the human brain, you would see a variety of processes involving quantum phenomena, electrochemical reactions etc. culminating in what some call 'qualia' or the 'feeling' of experiencing something. But these processes are not fundamentally different from the workings of a computer - except that the computer is made of silicon transistors rather than organic molecules.

My argument is that if you were to duplicate the processes of the human brain in a computer, it would 'experience' things just like we do, even though it is 'just a machine' made of wires and transistors. And if everything in the universe follows the same laws of physics, then there is no theoretical reason why we couldn't make such a computer. Current computers are not powerful enough to precisely emulate a human brain, but that doesn't mean they don't experience what little they can do - just as we can't say that a mouse doesn't experience things just because it only has 71 million neurons compared to our 86 billion.

That is unless you arbitrarily define 'experience' as something only humans can do - like saying that a monkey can't take a selfie because the word 'photographer' only applies to human camera operators. But is that the kind of semantic game you want to play?
 
At some point brains/nervous systems evolved to a level where they not only detect, map and react to their environment, as amoebas might do, but insert the creature itself into the world model they create as a thinking subject. Only at that level can you talk about feeling without conflating or overextending language.
 
Does a thermostat "feel" hot or cold?
Perhaps - depending on what definition you use.

feel
3. To receive information by touch or by any neurons other than those responsible for sight, smell, taste, or hearing.

Since temperature is sensed through the skin, it is considered to be 'feeling' hot or cold. The question is whether this sensory ability should only be called 'feeling' when humans do it, or whether it can be applied to other entities. Certainly other animals 'feel', though most people would say that plants don't. The difference is that animals have a nervous system that transmits the 'feeling' to the brain, where it is 'experienced'.

So a thermostat 'feels' heat like the nerve endings in skin do, but is this information transmitted to a 'brain' to be 'experienced'? A simple thermostat with a bi-metallic strip simply bends to make or break a contact as the temperature changes, like a plant responding to heat by wilting. But some modern electronic thermostats actually have a tiny computer inside, which processes and records temperature readings so that it can 'learn' from the 'experience' to better regulate room temperature under different conditions.
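The control loop in even a 'smart' electronic thermostat can be sketched in a few lines. This is a minimal illustration of on/off control with a hysteresis band (the class and parameter names are invented), not any particular product's algorithm:

```python
# A minimal thermostat sketch: it "senses" temperature and switches
# heating on or off, with a hysteresis band to avoid rapid cycling.
class SimpleThermostat:
    def __init__(self, setpoint, band=1.0):
        self.setpoint = setpoint   # target temperature
        self.band = band           # hysteresis half-width
        self.heating = False

    def sense(self, temperature):
        """Update the heater state from a temperature reading."""
        if temperature < self.setpoint - self.band:
            self.heating = True
        elif temperature > self.setpoint + self.band:
            self.heating = False
        return self.heating

t = SimpleThermostat(setpoint=20.0)
print(t.sense(18.0))  # True: too cold, heating switches on
print(t.sense(20.5))  # True: inside the band, state is held
print(t.sense(21.5))  # False: too warm, heating switches off
```

A 'learning' thermostat would additionally record these readings over time and adjust the band or setpoint schedule, which is where the analogy to 'experience' in the post above comes in.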

And those 'smart' thermostats were programmed by humans to do what we would do if reading dials and operating levers to control the A/C equipment manually. So in effect we have imparted to the thermostat some of our own thought processes, giving it the ability to 'feel' and 'experience' hot and cold on our behalf.
 
To run with such extended meanings of words like 'sense' or 'feel' is to unjustifiably anthropomorphise thermostats, some biomolecules, contemporary computers, toasters or probably insects. They may map and model their environment, but their brains/circuits/mechanisms don't have the complexity to posit or produce an experiencing self. We understand how thermostats work, and there is no room for such a thing, nor any other evidence for believing it occurs.

I certainly don't rule this out for machines at some point, or shades of grey in the biological world.
 
Because of its evolved nature, language is so poor at pinning down what we want to talk about while avoiding unjustified assumptions.
 
Materialism, as used by most of the sciencey posters here, is taken to be a form of naturalism: it means that those things that can be observed, measured and given metrics have a high level of validity in expressing theoretical models of the way the universe behaves.
That is a rather broad area, actually; in Cognitive Behavioral Therapy a huge number of cognitions and moods can be tracked as part of the intervention.

I am reluctant to link to a specific tool kit (because they come from sources unknown to me) so I have to read some before I get back to you.


Aaron Beck started a trend with the Cognitive Therapy of Depression
http://www.amazon.com/Cognitive-Depression-Guilford-Psychology-Psychopathology/dp/0898629195


But I want to do some more reading before recommending mood assessments from the web.
:)

Thank you, but I cannot read a whole book now. I was searching for an article only.
It seems that Beck's work is aimed at measuring depression, not emotions in general.

Anyway, I only know of some studies about the emotions that measure the responses of subjects, either in behaviour or with introspective answers.
They are not very exact.
There is no consensus in psychology about the different methods of measurement.

Therefore, the psychological measurements of some emotions (usually the strongest emotions) refer only to some partial responses to those emotions. These responses do measure something: but what kind of thing, mental or neuronal?

I don't know of any method for the objective measurement of all emotions, even soft feelings.
 
Not a particularly fair or meaningful question, really. How often have units of measure been named before the actual thing in question could be measured in a notable fashion, and why would being able to produce such a name actually mean anything for either your position or anyone else's on the general topic?

My comment was an answer to the concept of matter proposed by Dancing David. Fair or unfair, I don't know. But the mathematical concept of matter is a long-standing, interesting issue (Descartes).
 
