
Has consciousness been fully explained?

So model water can be wet?

liquid carbon tetrachloride can be wet?

If liquid carbon tetrachloride is a good model to use in the place of water for your purposes, then it's a good model to use in the place of water for your purposes. The question of whether or not it's "wet" is irrelevant, unless of course you need your model of water to be wet in order for it to do whatever you need it to do.
 
Well, you look at the difference between a human and a squirrel, and then see what more you could remove and have the thing still be conscious. Maybe you could whittle it down to something like a very expensive toaster.

Or are you going to arbitrarily label a squirrel as the least complex creature that is conscious?

Hmm?

I fail to understand any of that.

If a squirrel's brain does the same kind of thing my brain is doing when it's conscious, then a squirrel is conscious. If not, then it's not.

Toasters are not designed and built to be conscious, so they are not.

Removing parts from a squirrel to turn it into a toaster is... well, I have no clue what that is.
 
I didn't say consciousness.

I said self recognition. Self reference.

And, yes, we do know if such behavior can result from something else. It cannot. Self reference can only result from self reference.

This thread is about consciousness, not self-reference.

"Self-recognition" is a very mushy term that could mean quite different things in different circumstances.

When my truck's computer monitors the engine's performance, you could say that this is "self-recognition" on the truck's part, but this has nothing to do with consciousness.
 
If I say something to you, then it's true that I spoke -- that is, I made noise with my mouth -- even if you can't understand me for whatever reason.

On the other hand, when you look at the screen, if you can't interpret the readout, then it's not true that "addition" has taken place. All that's true is that the screen lit up.

It's just like the abacus. In objective physical reality, all that happens is that someone moves beads. The addition is entirely in that person's imagination. Digital calculators are no different.


In what way is it not true that addition has taken place? Addition is defined as a mathematical function. If the computer follows the steps and performs that mathematical function, whether or not I look at the screen it is still the case that it performed the function. It added. The function is already defined.

An abacus falling is not calculation because not only is the input random, but the 'calculation' is random. It does not follow the steps of addition in a controlled way; it does not follow the paradigm.
 
Information, interpretation, and meaning would exist for the builder. Where do you suggest it exists for the model itself? I see none therein.
I was simply questioning whether you'd use those terms when describing the operation of a simple device.

I may go with "intent is just part of the fabric of space-time". Thanks, wasp. :D
Please explain what you mean - or is it a meaningless statement?
 
In what way is it not true that addition has taken place? Addition is defined as a mathematical function. If the computer follows the steps and performs that mathematical function, whether or not I look at the screen it is still the case that it performed the function. It added. The function is already defined.
Sure, but not defined by the computer; without a sentient observer nothing occurred.

An abacus falling is not calculation because not only is the input random, but the 'calculation' is random. It does not follow the steps of addition in a controlled way; it does not follow the paradigm.
I suspect though we agree getting jostled randomly has equal meaning to the abacus as any other operation that could be performed on it.
 
You're jumping the gun.

If you've got a very simple organism like that, it really makes no sense to talk about what the touch "means". It's just that the thing has evolved to react in certain ways.

And if you take physical science seriously, our brains are doing only that, too... but in an enormously complex way.

It's all physio-energetic action and reaction, nothing more.

Of course, it's so complex (and, so far, unamenable to direct research) that we're forced to describe situations in other terms.

I saw the STOP sign, I understood what it meant, so I stopped the car.

But that's a shortcut for whatever is really going on, on a purely physical level -- which is the only real level, as far as we know.


Meaning must arise through a physical process. In what way does the contextual process described not amount to meaning? It includes a definition of information, specified information, contextual environment and 'programmed' intent.

I am not arguing that such a simple organism understands consciously. That is an entirely different issue. I am arguing that its actions have meaning -- the meaning concerns survival. I also don't mean this to explain all there is to 'the meaning of meaning'; perhaps proto-meaning is a more proper term. The important issue is that it introduces a form of intent without a conscious mind being involved in the processing of information.
 
Sure, but not defined by the computer; without a sentient observer nothing occurred.

The sentient observer defined the function when she built the computer.


I suspect though we agree getting jostled randomly has equal meaning to the abacus as any other operation that could be performed on it.


Yep.
 
dlorde said:
Information, interpretation, and meaning would exist for the builder. Where do you suggest it exists for the model itself? I see none therein.
I was simply questioning whether you'd use those terms when describing the operation of a simple device.
Answered. See the part I bolded?

Now what's your answer to my question?

I may go with "intent is just part of the fabric of space-time".
Please explain what you mean - or is it a meaningless statement?
If you find the statement meaningless, your choice.
 
My bad for imprecision. Yes, a sentience defined it, and if a sentience doesn't at some point interpret the output, the output has no meaning.

Right. But the machine still adds. Doesn't matter if the output is interpreted for it to add. It adds following the same process it does when we watch it -- a process that we define for it.
 
Right. But the machine still adds. Doesn't matter if the output is interpreted for it to add. It adds following the same process it does when we watch it -- a process that we define for it.
Definition is not the key; something sentient must understand -- give meaning to, experience as useful -- the output.

Without sentience to recognize that addition occurred, the register values, printout, or CRT display have no more meaning than the pile of scree after a rockfall.

Or we may have reached an agreement to disagree point? :(
 
A brain-in-a-vat would not be able to indicate that it's conscious. That does not entail that it's unconscious.

And how would you tell it is conscious? Perhaps you propose that some sort of brain scan would tell you, yes?

How do we know those scans are accurate? Because we know how they relate to the actual behavior of people who are alert and able. So in the end, being able to tell anything about a brain in a jar is thoroughly rooted in observations of behavior (and how that behavior relates to brain activity).
 
Where do you get that last sentence from?

You simply leap to it. It's a non-sequitur.

Yes, if you replace a brain with some physical thing that carries out the same tasks, well, you've replaced it with an artificial brain.

However, one cannot replace any physical object with a digital simulation of that object, because digital simulations are abstractions, not real things.

If you want to claim the brain is an exception, you have to explain how this could possibly be true; you can't simply assert it.

If you want something that's going to physically accept the electro-chemical inputs and to physically output the necessary electro-physical stuff that will keep the chain reaction going outward toward the rest of the body, you need a physical apparatus, not a simulation of such an apparatus.

So, what will this artificial brain look like? How will it work?

We can't say with any level of detail because we don't know how the brain does most of what it does.

But we do know that the brain does not run simulations, so we can be confident that it can't be replaced with a machine that runs simulations. It has to be an object that does brainy kinds of stuff, physically.

It can't be an object that does other sorts of stuff physically.

A computer running a simulation would work perfectly. We can already induce nerve impulses with electricity. We can induce different signals and modify the strength of those signals. We can also read signals running up a nerve. Heck, we do stuff like this to control artificial arms and, with research, artificial eyes.

You run a simulation of a brain. You have a device that reads nerve signals that would be sent into the brain, software then translates those results into signals sent to the appropriate nerve in the simulation. For output you do the reverse.

The interface between the simulation and the brain is trivial compared to the simulation itself.
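The loop described above -- read nerve signals going in, translate them for the simulation, and translate the simulation's output back into nerve stimulation -- can be sketched in a few lines. This is a toy illustration only; every name and signal format here is invented, and real neural interfaces are nothing like this simple.

```python
# Hypothetical sketch of the nerve <-> simulation bridge described above.
# All names and data shapes are illustrative assumptions, not a real API.

def run_interface(read_afferent, simulate_step, write_efferent):
    """One tick of the bridge: sense, simulate, stimulate."""
    inputs = read_afferent()           # sampled incoming nerve impulses
    outputs = simulate_step(inputs)    # the brain simulation advances one step
    write_efferent(outputs)            # induce the corresponding outgoing signals
    return outputs

# Toy stand-ins so the loop can actually run:
if __name__ == "__main__":
    log = []
    out = run_interface(
        read_afferent=lambda: {"optic": 3, "auditory": 1},
        simulate_step=lambda sig: {k: v * 2 for k, v in sig.items()},
        write_efferent=log.append,
    )
    print(out)  # {'optic': 6, 'auditory': 2}
```

The point of the structure is the one made in the post: the translation layers at the edges are simple plumbing compared to whatever `simulate_step` would have to be.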
 
Right. But the machine still adds. Doesn't matter if the output is interpreted for it to add. It adds following the same process it does when we watch it -- a process that we define for it.


It's a fine distinction, eh. Does a watch keep time if no one watches it? Sounds like a yes. Does it tell time? Sounds like a no.

In the same sense, maybe the machine is keeping sums but not telling sums. (In our now thread-famous A-B-C info chain, with no one around you have A [action] & B [intermediary], but no C [reaction], at least as far as the print-out goes; internally, if the sums it's keeping are being passed as I/O between functions, one might argue the A-B-C chain is complete, that the machine is telling sums to itself, fwiw; however, one can still argue without some sort of awareness of what mathematics is for, it's "meaningless" information; and of course in this second, 'emotional' sense of "meaning", roughly enhancement of survival, the information is meaningless, until you establish the machine is capable of that sort of awareness).
 
Definition is not the key; something sentient must understand -- give meaning to, experience as useful -- the output.

Without sentience to recognize that addition occurred, the register values, printout, or CRT display have no more meaning than the pile of scree after a rockfall.

Or we may have reached an agreement to disagree point? :(


I guess we have to, because addition is a process following a particular algorithm. That is how it is defined. The computer actually does something physical with the program running -- it moves those electrons around to perform the algorithm and arrives at a sum. That action is translated back out onto a screen or sheet of paper. Are we really saying here that the action didn't take place or that it didn't constitute addition until I look at the number produced?

Let me try an example -- let's say we have a computer add a row of numbers and we get as output another row of numbers. I look over the row of numbers that constitute the output, but my dog is barking and I'm distracted and miss one of the numbers. Is it really the case that the computer added all the numbers that I looked at but did not add the numbers to get the sum that I overlooked? So if I go back and look at the sums later, then it added it?
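The claim in the two paragraphs above -- that the addition happens when the algorithm runs, not when someone reads the screen -- can be made concrete with a minimal sketch. Names here are invented for illustration; the separation that matters is between computing the sum and displaying it.

```python
# Minimal illustration: the machine executes the addition algorithm whether
# or not anyone reads the result. Display is a separate act from computation.

def add_row(numbers):
    """Follow the addition algorithm step by step."""
    total = 0
    for n in numbers:
        total = total + n   # each step physically changes machine state
    return total            # the sum now exists in a register either way

result = add_row([2, 3, 5])   # the computation happens here...
# print(result)               # ...looking at the output is a separate step
```

Whether commenting out the `print` changes anything about what `add_row` did is, of course, exactly the question being argued in the thread.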
 
It's a fine distinction, eh. Does a watch keep time if no one watches it? Sounds like a yes. Does it tell time? Sounds like a no.

In the same sense, maybe the machine is keeping sums but not telling sums. (In our now thread-famous A-B-C info chain, with no one around you have A [action] & B [intermediary], but no C [reaction], at least as far as the print-out goes; internally, if the sums it's keeping are being passed as I/O between functions, one might argue the A-B-C chain is complete, that the machine is telling sums to itself, fwiw; however, one can still argue without some sort of awareness of what mathematics is for, it's "meaningless" information; and of course in this second, 'emotional' sense of "meaning", roughly enhancement of survival, the information is meaningless, until you establish the machine is capable of that sort of awareness).


Excellent way to put it. Yes, it would keep sums because it follows the algorithm that defines what addition is but not tell them if we do not see the output. Yes, if we do not see the output, then the information is meaningless -- just as a foreign language may be meaningless to us or if someone speaks and the noise level is too high for us to interpret their speech.
 
You're jumping the gun.

If you've got a very simple organism like that, it really makes no sense to talk about what the touch "means". It's just that the thing has evolved to react in certain ways.

And if you take physical science seriously, our brains are doing only that, too... but in an enormously complex way.

It's all physio-energetic action and reaction, nothing more.

Of course, it's so complex (and, so far, unamenable to direct research) that we're forced to describe situations in other terms.

I saw the STOP sign, I understood what it meant, so I stopped the car.

But that's a shortcut for whatever is really going on, on a purely physical level -- which is the only real level, as far as we know.


Let me try to answer this again in slightly more detail, because I think dlorde's way of reframing it as a single-celled organism (how it would actually have originally evolved in the first place) is better than my clumsy attempt. His example is better than the one I offer below because he adds more details, but I want to focus on just a few aspects.

I think it does make sense to speak of what a touch means even with purely instinctual responses. It does not make sense to speak of the organism understanding the meaning, though. Clearly I am not referring to linguistic meaning here.

Let's say we have a single-celled organism with three receptors -- one that senses simple sugars, which it uses for food; one that senses the presence of sulfur, which will kill it in high concentrations; and one that senses the presence of other organisms of its kind in order to facilitate DNA transfer (say, each organism excretes an identifying peptide). Each of these receptor proteins links to the cytoskeleton in order to produce movement either toward or away from a stimulus. The behavior of such an organism would change depending on its chemical environment. It only responds to three things with either approach or avoidance, but each of those signals has a meaning for this organism -- it is a particular type of signal that alters its behavior in order to enhance its survival.

This organism exists in an environment in which all sorts of environmental issues arise. It can run into streams of arsenic, but it has no receptors and arsenic does nothing to its internal function, so arsenic is not meaningful to this organism. It can be jostled by a swimming fish -- something that clearly changes the organism (the fish swimming by constituting information) -- but being jostled by a fish ends up having no positive or negative impact on this organism's survival. So, while the swimming fish is information (poor information, it turns out because many other things could jostle the cell), it is not meaningful information. We could imagine numerous other examples, but the point is that some data is meaningful to such an organism and some is not, where meaning arises in enhanced survival (the reason for the behaviors in the first place).

Of course we speak of many other types of meaning, but on what grounds does this not qualify as meaningful information for such an organism?

ETA:
Or, if it helps, replace meaning or meaningful with a synonym -- significant.
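The three-receptor organism above is simple enough to caricature in code, which may make the distinction between meaningful and meaningless stimuli clearer. The receptor names and responses below are invented for illustration, following the post's own examples.

```python
# Toy model of the three-receptor organism described above. A stimulus is
# "meaningful" here only if a receptor maps it to a survival-relevant behavior;
# anything else (arsenic, a passing fish) produces no behavioral change at all.

RECEPTORS = {
    "sugar":   "approach",   # food
    "sulfur":  "avoid",      # toxic in high concentration
    "peptide": "approach",   # another organism of its kind (DNA transfer)
}

def respond(stimulus):
    """Return the behavioral response, or None if the stimulus has no receptor."""
    return RECEPTORS.get(stimulus)

print(respond("sugar"))    # approach
print(respond("arsenic"))  # None -- no receptor, so no change in behavior
```

On the post's usage, "meaning" just is membership in that receptor table: the organism's behavior distinguishes sugar from arsenic, and nothing conscious is needed for it to do so.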
 
In what way is it not true that addition has taken place? Addition is defined as a mathematical function. If the computer follows the steps and performs that mathematical function, whether or not I look at the screen it is still the case that it performed the function. It added. The function is already defined.

It is true on an abstract level. That is, as long as you can interpret the sign system.

But on the level of objective reality, no.

If I pick up a STOP sign and hit you with it, you get injured, regardless of whether or not you read English. That's objectively physically real.

This is not the case when it comes to getting "information" from the sign, or from the calculator or the abacus.

The calculator has not performed any mathematical function in any objective physical sense.

Rather, like the abacus, it has performed a physical function which we have set up to symbolically represent a mathematical function only if we are able to read the symbol system.

This difference is crucial. And if we ignore it, we will come to erroneous conclusions with regard to physical systems such as the brain.

Consciousness is not an abstraction. It is a behavior.
 
Meaning must arise through a physical process. In what way does the contextual process described not amount to meaning? It includes a definition of information, specified information, contextual environment and 'programmed' intent.

I am not arguing that such a simple organism understands consciously. That is an entirely different issue. I am arguing that its actions have meaning -- the meaning concerns survival. I also don't mean this to explain all there is to 'the meaning of meaning'; perhaps proto-meaning is a more proper term. The important issue is that it introduces a form of intent without a conscious mind being involved in the processing of information.

Why are you talking about "meaning" at all?

Would you presume to describe black holes or supernovas in terms of "meaning"?

Of course not.
 