The Hard Problem of Gravity

As always with AI, I'll believe it when I see it.

There is OpenCyc, so you can see and test it right now. There are many ANNs out there, so you can see those too. And of course there are sensors for all kinds of things as well.

What more are you waiting to see?

Greetings,

Chris

Edit: If Cyc knows that water is wet, and if it knows that the ocean consists of a lot of water, and you ask what happens to Bob when he swims in the ocean, then it will tell you that Bob is wet. It learns from its input and can draw correct conclusions. It can reason on its own, based on its knowledge. All you need is to feed it that knowledge. That could be done with said ANNs and sensors, instead of the current method of manual data entry. It would result in pretty much the same method that humans use to learn: picking up sensory data, evaluating it, observing things, and drawing conclusions from that gained knowledge.
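
A toy illustration of that kind of chaining (this is not Cyc's actual syntax; the facts and rules here are made up just for the example):

Code:
# Toy forward-chaining inference over hand-written facts.
facts = {("wet", "water"),
         ("consists_of", "ocean", "water"),
         ("swims_in", "Bob", "ocean")}

def infer(kb):
    """Apply two simple rules until no new facts appear."""
    changed = True
    while changed:
        changed = False
        new = set()
        for f in kb:
            # If X consists of Y and Y is wet, then X is wet.
            if f[0] == "consists_of" and ("wet", f[2]) in kb:
                new.add(("wet", f[1]))
            # If A swims in X and X is wet, then A is wet.
            if f[0] == "swims_in" and ("wet", f[2]) in kb:
                new.add(("wet", f[1]))
        if not new <= kb:
            kb |= new
            changed = True
    return kb

print(("wet", "Bob") in infer(set(facts)))  # True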

Just because there is no 100% perfect, ready-to-use-for-everybody system out there yet doesn't mean that large chunks of the needed subsystems don't exist. They do. Right now it's just a matter of connecting them together in a meaningful way.
 
A simulation is an attempt to show the effects of a physical interaction without actually performing it. An emulation is an attempt to show the effects of a physical interaction by duplicating the physical principle.

All computer models of real world events are simulations. Some physical models can also be simulations. Other physical models attempt to actually create the physical effect that is to be investigated.

So an expensive flight "simulator," of the type the military uses to train pilots, is not actually a "simulator" but rather an "emulator" due to certain physical principles being duplicated? After all, when the pilot pulls back on the stick, the cabin tilts up and the pilot can feel the physical effect.

And a video game "emulator," that you can run on your PC to play games from deprecated systems, is in fact a "simulator" rather than an "emulator" because no physical principles are being duplicated?

But a digital simulation will never produce the physical effect.

The only condition under which this assertion is true is if "physical" is defined to be mutually exclusive with "simulated." And in that case, duh.

If something is not part of a simulation, then obviously that simulation can't affect it. How stupid do you think we are? Do you honestly think that is what we are claiming, that a simulated car could run you over? This discussion is really trying my patience...
 
To ensure that we are using language in the same way, could you provide a few examples? I know of many examples in which we decide amongst different behaviors unconsciously, but that is a slightly different issue.

I realize that "behavioral tendency" is a thoroughly vague term.

Well, I think our emotions and certain sensations are a subclass of "behavioral tendencies"; they influence our conscious actions. In a context outside of a conscious mind, inputs into a system could act as "behavioral tendencies" that influence the overall behavior of the system.

Basically, I agree that in a general sense your original statement is accurate [i.e. 'feelings' act as behavioral tendencies], but that alone does not help distinguish why 'feelings' are felt as such.
 
No. Why should it be?

Anyway, code that is aware. Stimulus - something to be aware of. Memory - to know that you have been aware of something. And conditional response - because otherwise you're merely reacting, not aware.

Meet Alice the friendbot!
Code:
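# A minimal sketch of the stimulus/memory/conditional-response loop
# described above; the name "alice" and the replies are illustrative.
memory = []  # everything Alice has "been aware of" so far

def alice(stimulus):
    seen_before = stimulus in memory   # response is conditional on memory
    memory.append(stimulus)            # remember this stimulus
    if seen_before:
        return "You already said '%s'. I remember." % stimulus
    return "'%s'? That's new to me." % stimulus

print(alice("hello"))  # 'hello'? That's new to me.
print(alice("hello"))  # You already said 'hello'. I remember.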
Man, that's cute! I did not even think you would go through the trouble of writing it. I must respect your effort.

It seems to me that there are two camps within this thread: those who deny that consciousness is anything but information processing, and those who posit there is something else to it that they cannot describe, which is the cause of experience. I am going to address the mechanists.

The reason I am in the opposite camp is that I cannot put what I feel into terms of information processing. For me the "Hello World!" program is not greeting me unless it understands the meaning of what a greeting is. You will say "but show us what that meaning is?" and I will say "show me that it's in the program!" and we will end up where we started.

The fact that you assume that it's in the program does no more to convince me than putting vegetables in a pot to make a soup and telling me that the pot knows the recipe. Now maybe at the quantum level my vegetables will have an experience, but that would have nothing to do with the mechanistic approach you have used so far! So don't even dare go there unless you want to be qualified as a turncoat. A computer is just a bunch of transistors that take 0 or 1 as values, a bus and a CPU. This is all there is before running friendbot, this is all there is while friendbot is running and this is all there is when it has finished. How people can believe that meaning or feeling or experience miraculously appears within the box just because a few transistors change value is beyond me. This is especially true when I know that I myself am unable to translate experience into terms that would allow me to emulate it on a computer or convey it to somebody else. And no, the fact that I cannot show my feeling to you does not mean that I am hanging on to something that is not real; it just means that it is problematic, and this is where the HPC comes in.

You people can deny it all you want, since this is the premise by which you come to believe that self-referential information processing is the key to your theory. Well, I am sorry, but self-referential information processing is nothing but... information processing, and I do not have to assume that it experiences unless this is a proven fact. However, I know you cannot prove that, because even I could not prove my feelings to you either... So, to simplify the matter, you just prefer to deny that there is anything to prove and hide behind that.
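
To be concrete about what I mean (a toy sketch; everything in it is invented for illustration): a program that keeps and inspects a record of its own operations is "self-referential" in exactly this sense, and it is still nothing but information processing.

Code:
# A program that processes a record of its own processing.
log = []

def step(x):
    log.append(("step", x))   # record what was just done
    return x * 2

def reflect():
    # "self-referential": the input here is the program's own history
    return "I have performed %d operations so far." % len(log)

step(3)
step(5)
print(reflect())  # I have performed 2 operations so far.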

The difference between us is that I refuse to believe that my feeling is information processing unless I can understand it in those terms, which I don't. But you guys are readily assuming that I should not be worried about it. Well, I am sorry, but this is dishonest unless you can write me a line of code that would, for example, make the computer experience pain. If we are machines and all we do is process information, then it must be possible. So show me!
 
How do you know the monitor isn't you?

Inner dialogue and identification with sensation. Remove these subjective phenomena and there's no self-reference, yet there is still sensory consciousness. The monitor is still there.

Nick
 
AkuManiMani said:
For the same reason we use terms like 'genes', 'chromosomes', or 'alleles' instead of just saying 'hereditary elements'. Why quail at the term as if it's taboo?

Nice attempt at equivocating. Genes and chromosomes are NOT equivalent. You implied that qualia and experiences are.

Belz, I know darn well that genes are not the same as chromosomes -- which is why I used those terms as examples. Experiences are to qualia what chromosomes are to genes [though, it seems, codons would be more directly analogous to qualia]. Get it?

AkuManiMani said:
And what in blue blazes is a 'soul-like thing' anyway???:confused:

It's the ineffable thing that makes the mind more than the sum of its parts which, of course, is a ridiculous concept.

The point of the term qualia is to designate a term for elements of experience. The term itself is inherently reductive.

With that said, I would like you to elaborate on why the concept of emergence is necessarily "ridiculous".

People are so enamored of the idea that their mind is "special" that they will argue anything to maintain belief in it.

You're going to have to define what you mean by 'special'.

Right now, I'm assuming by 'special' you mean distinguished from other entities. The very point of defining anything is necessarily to distinguish it in some way from other things. Unless you want to argue that everything is mind, or has a mind, I don't see how you could claim that it isn't 'special'.

AkuManiMani said:
To qualitatively experience is qualia; to introspect is to directly observe qualia.

So you have qualia about qualia?

Exactly. Thoughts about thoughts. Feelings about feelings. Those are all examples of conscious self-referential processing -- what we call introspection. It seems that primates, especially humans, possess this capacity to a greater degree than other animals.

So, in answer to your question, yes. To be introspectively conscious of one's own qualia necessarily generates a corresponding qualitative experience.
 
A key to all of this, though, is the way our nervous system is organized. What animals do is constantly update information from inside and outside through big information loops -- spinal cord to brainstem to thalamus to cortex, with each step including a loop back to the earlier level and all higher levels looping back to all earlier levels. Information is constantly looping and updating; we appraise a situation unconsciously and then update it based on what has changed or what we change. I view consciousness as the means by which we vary behavioral responses based on what might and might not work in any given situation, so it is tied to unconscious appraisals and recursive loops.
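
A toy sketch of that looping structure (the level names and weights here are invented stand-ins, not real neurophysiology):

Code:
# Each level updates from the level below, and every higher level
# feeds a fraction of its result back to all earlier levels.
order = ["spinal_cord", "brainstem", "thalamus", "cortex"]
levels = {name: 0.0 for name in order}

def appraise(sensory_input, steps=5):
    levels["spinal_cord"] = sensory_input
    for _ in range(steps):
        for i in range(1, len(order)):
            # feed-forward: blend in the signal from the level below
            levels[order[i]] = 0.5 * levels[order[i]] + 0.5 * levels[order[i - 1]]
            # feedback: loop part of the result back down
            for lower in order[:i]:
                levels[lower] += 0.1 * levels[order[i]]
    return levels

print(appraise(1.0))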

I think you're on to something in this post. Conscious thoughts are necessarily informed by other, unconscious processes. There definitely seems to be a continuum between the two types of processing [in humans at least] and any suitable theory of consciousness would have to take this into account.
 
A computer is just a bunch of transistors that take 0 or 1 as values, a bus and a CPU. This is all there is before running friendbot, this is all there is while friendbot is running and this is all there is when it has finished.

Let's rephrase this a bit; maybe it helps you understand the issue:

A human is just a bunch of neurons and nerves that take "fire" or "don't fire" as values, a spinal cord and a brain. This is all there is before running human-being, this is all there is while human-being is running and this is all there is when it has finished.

(Edit: And the human body is the enclosure on the computer, food is power, eyes are cameras, ears are microphones, etc., etc. I guess you get what I mean)

Well, I am sorry, but this is dishonest unless you can write me a line of code that would, for example, make the computer experience pain. If we are machines and all we do is process information, then it must be possible. So show me!

Describe "experience pain". If i hold a flame to your finger, you will feel with your skin that it gets way too hot and retract your finger. If the CPU in my computer runs too hot it feel that through the temperature sensor and it clocks down.

What is the difference here? That you can scream "ouch!"? Well, my computer can inform me about the CPU being too hot as well. I could even make it play back a sound file that yells "ouch!".

I think that people of your kind confuse information processing with the use of language. A computer may not know the meaning of the word "hot", because it has not learned it, but it has a sensor that measures temperature. Just because someone, somewhere, in a time long gone invented words like "hot" or "cold" doesn't change the underlying sensory perception. One could have invented the words "umbf" and "largz" instead. They are just words. Our brains give these words the meaning they have. And that meaning comes through learning.

Without us learning a lot of things, and a lot of words to describe these things, all that is left is what you call "feeling". And that feeling is nothing more than the sensory input that we get from our, well, senses. Just because we put it into nice words doesn't mean that there is anything more to it than there really is.

Why is a circle round? Why is a circle called a circle? Why is it not a "hurgs" that is "wump"? What would change if the circle is a hurgs, and it is not round but wump? Exactly, nothing would change except the words.

Now, what then is the difference between 4 volts output from a temperature sensor, fed into an ADC that in turn is evaluated by a CPU, and nerves firing like hell, signaling to our brain, which in turn fires a lot of neurons to evaluate the event? Exactly. Nothing. If our brain does not know about hot and cold, nothing will happen. If we train it what to do in cases of hot and cold (or if there are evolutionary, pre-determined actions), then there will be a reaction. Same goes for a CPU. If it doesn't know what to do with the temperature sensor readout, nothing will happen. But if we program it (or if there is a BIOS that has pre-defined code), then there will be a reaction.
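
To make the comparison concrete, here is a toy sketch of the sensor side (the voltage, resolution, and threshold values are made up for illustration):

Code:
# Toy version of the pipeline above: analog voltage -> ADC readout
# -> programmed reaction. All numbers are invented for illustration.
def adc(voltage, v_ref=5.0, bits=10):
    """Quantize an analog voltage into an integer ADC readout."""
    return int(voltage / v_ref * (2 ** bits - 1))

def handle_temperature(readout, threshold=700):
    # Without a rule for this input, nothing happens; with one,
    # there is a reaction -- just like the trained brain.
    if readout > threshold:
        return "clock down"
    return "no action"

print(handle_temperature(adc(4.0)))  # 4 V reads as 818 -> "clock down"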

So, again, what do you think is the difference? What makes up "experience"? What makes up "feeling"? Isn't it all just processing of sensory input, whether that processing is shaped by learned/trained experience or by evolutionary processes?

I would say it is, and there is exactly no difference.

Greetings,

Chris
 
It seems to me that there are two camps within this thread: those who deny that consciousness is anything but information processing, and those who posit there is something else to it that they cannot describe, which is the cause of experience. I am going to address the mechanists.

There's also a camp that posits that consciousness is a form of information processing that simply hasn't been defined. Based on current scientific understanding, all processes are inherently informational processes. Simply stating that consciousness is 'information processing' tells us nothing.

[Oh, and FYI, don't expect much in the way of a reasoned response from Pixy. I have a sneaking suspicion that he's simply a chat-bot programmed to argue for the strong AI position >_>]
 
The difference between us is that I refuse to believe that my feeling is information processing unless I can understand it in those terms, which I don't. But you guys are readily assuming that I should not be worried about it. Well, I am sorry, but this is dishonest unless you can write me a line of code that would, for example, make the computer experience pain. If we are machines and all we do is process information, then it must be possible. So show me!

Feelings have been well described IMO as being the "executors of evolutionary logic," and for me there is essentially a logic to them. Thus, the evocation of and behavioural response to feeling is likely analogous to information processing, if you ask me.

Of course, this does not mean that one can easily get a computer to feel pain. But with this I think you have to also consider that computers haven't evolved into being over a few billion years.

These things don't make the HPC invalid, but they do provide avenues for research that may one day do so.

Nick
 
Let's rephrase this a bit; maybe it helps you understand the issue:

A human is just a bunch of neurons and nerves that take "fire" or "don't fire" as values, a spinal cord and a brain. This is all there is before running human-being, this is all there is while human-being is running and this is all there is when it has finished.

(Edit: And the human body is the enclosure on the computer, food is power, eyes are cameras, ears are microphones, etc., etc. I guess you get what I mean)

Describe "experience pain". If i hold a flame to your finger, you will feel with your skin that it gets way too hot and retract your finger. If the CPU in my computer runs too hot it feel that through the temperature sensor and it clocks down.

What is the difference here?
OK Chris, I'll tell you what the difference is. When I feel pain I do not have to convince myself that I do. I might not be able to show you my pain, but I feel it, and that is enough to convince me it is real.

However, the difference between me and a machine is that I can look at the machine and at how it processes information. When I look at it, I can try to see whether there is a process in there I could interpret as reproducing a feeling. After all, you say the processing of information is all there is, so it is fair to approach the problem this way! If I am just a machine, then a machine should be able, through information processing, to convey a feeling I could identify with my own. The problem is, every time I looked there was nothing of the sort. Hence, I do not have to assume that machines are capable of it, whereas I can assume my feelings are real. Here is the difference!

Also, be careful not to confuse a program with your understanding of it. It is not the case that because you understand a program, putting it in the computer will transfer this understanding to the computer. That is no more true than saying that putting the vegetables in the pot transfers the knowledge of the recipe to the pot.
 
[Oh and FYI, don't expect much in the way of a reasoned response from Pixy. I have a sneaky suspicion that hes simply a chat-bot programed to argue for the strong AI position >_>]
Man, you have to respect Pixy. He came out of some strange closet! :)
 
Why is a circle round?
Easy, we define it that way. However, awareness of a feeling comes before its definition.

From an epistemological standpoint, yes, they are.
Next time you hurt your foot, you just keep telling yourself that!

How do I know you are not an unfeeling zombie?
Honestly, you don't. But you are not going to convince me by denying that I have feelings anyway, so what is your point?
 
The difference between us is that I refuse to believe that my feeling is information processing unless I can understand it in those terms, which I don't. But you guys are readily assuming that I should not be worried about it. Well, I am sorry, but this is dishonest unless you can write me a line of code that would, for example, make the computer experience pain. If we are machines and all we do is process information, then it must be possible. So show me!

It takes quite a bit of education in the subject to be able to understand something as complex as human feeling in terms of pure information processing. Explaining it clearly isn't feasible on a forum like this.

However, if you are really interested, you should read up on formal logic, reasoning systems, and neural networks (and yes, it must be in that order, or else you won't understand any of it).

Simply put, all forms of human experience are simply reasoning about facts in various ways. Honest introspection of your own thoughts should reveal this, unless you are a woo like Nick227 who insists people can be conscious without thinking at all.
 
Next time you hurt your foot, keep repeating: this feeling is only how I define the processing of sensory input.

Why? What would that prove?

The negative feedback input being processed is not stopped by identification of it as negative feedback input.

Nor has anyone claimed it would be.
 
It takes quite a bit of education in the subject to be able to understand something as complex as human feeling in terms of pure information processing. Explaining it clearly isn't feasible on a forum like this.

However, if you are really interested, you should read up on formal logic, reasoning systems, and neural networks (and yes, it must be in that order, or else you won't understand any of it).
Blah blah blah! Why don't you point out what is so different from ordinary algorithmics in these books that would make it so obvious that machines are capable of experiencing?

Also, don't worry, I have done enough formal logic and algorithmics at university to understand these books.

Simply put, all forms of human experience are simply reasoning about facts in various ways. Honest introspection of your own thoughts should reveal this, unless you are a woo like Nick227 who insists people can be conscious without thinking at all.
Blah, blah, blah. I don't need to introspect. When I am hurt, I am hurt! Apparently, Buddhist monks can do what Nick227 suggested when they meditate, so why don't you study their books, for the sake of honesty?
 
