
Explain consciousness to the layman.

What about when your cursor moves across your computer monitor? How can you describe its motion in physical terms?


Indeed. That's a form of motion that, while it can be described in physical terms, isn't the same kind of thing at all. Such apparent motion can exceed the speed of light, for example.

Considering I input instructions into computers all the time, the bolded word seems poorly chosen.

Well, I do hate to personalise the issues, but I'm assuming that you're a human being. The instructions come from you.
 
Indeed. That's a form of motion that, while it can be described in physical terms, isn't the same kind of thing at all.

The cursor can be described in physical terms? Where is it, exactly?


Well, I do hate to personalise the issues, but I'm assuming that you're a human being. The instructions come from you.

The instructions can also come from a computer or some inanimate process like pressure on a barometer, for instance. You are intentionally picking human agency to support your point about anthropocentrism.
 
The problem is the assertion that the running of an "algorithm" results in consciousness.
Why is that a problem?

This is a non sequitur, not least given the problems with a definition of consciousness.
How is it a non-sequitur?

Once this assertion is demonstrated by producing a consciousness in the lab, I will accept it and queue up at the teleportation device.
Now that is a non-sequitur.
 
The cursor can be described in physical terms? Where is it, exactly?

It depends. You can have an anthropocentric view, which depends on what it looks like. Or you can go with the physical description of different pixels being lit in succession. However, the two things are distinct.

Motion as a physical phenomenon has a physical description. Motion as a subjective phenomenon has a subjective description.

The instructions can also come from a computer or some inanimate process like pressure on a barometer, for instance. You are intentionally picking human agency to support your point about anthropocentrism.

Human agency is involved in all computer systems. The meaning of the algorithm comes from the purpose. A system with a computer is not distinct from a system with a computer and a barometer. The description of the system relates to the purpose for which it's used.

It's possible to provide a physical description of any single algorithmic system, obviously. I'm looking for a general description.

EDIT:

It's possible for a computer system to be entirely self-contained. It might, for example, be a standalone system that controls the level of water behind a dam. When the level of the water reaches a certain depth, a gate will open to release water, and close when the water has dropped again. Such a system might continue to operate after all human beings had died.

However, if we are to identify the algorithm associated with the system, in an unambiguous, physical way - as opposed to a mere physical description of the operation of the system - then it is difficult to see how we can do so without reference to human agency and purpose.
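For concreteness, here is a minimal sketch of the kind of standalone controller described above; the depth thresholds, polling interval, and sensor/gate functions are hypothetical placeholders, not any real dam's interface:

```python
import time

OPEN_DEPTH_M = 10.0   # hypothetical depth at which the gate opens
CLOSE_DEPTH_M = 8.0   # hypothetical depth at which it closes again

def read_water_depth() -> float:
    """Placeholder for a real depth sensor."""
    raise NotImplementedError

def set_gate(opened: bool) -> None:
    """Placeholder for a real gate actuator."""
    raise NotImplementedError

def run_controller(poll_seconds: float = 60.0) -> None:
    gate_open = False
    while True:                                  # keeps running with no human in the loop
        depth = read_water_depth()
        if not gate_open and depth >= OPEN_DEPTH_M:
            set_gate(True)                       # release water
            gate_open = True
        elif gate_open and depth <= CLOSE_DEPTH_M:
            set_gate(False)                      # water has dropped again
            gate_open = False
        time.sleep(poll_seconds)
```

A physical description of the electronics stepping through this loop is one thing; whether those steps count as "the algorithm" independently of the purpose it was built for is the question being raised here.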
 
However, if we are to identify the algorithm associated with the system, in an unambiguous, physical way - as opposed to a mere physical description of the operation of the system - then it is difficult to see how we can do so without reference to human agency and purpose.
You seem confused.
 
It depends. You can have an anthropocentric view, which depends on what it looks like. Or you can go with the physical description of different pixels being lit in succession. However, the two things are distinct.

Motion as a physical phenomenon has a physical description. Motion as a subjective phenomenon has a subjective description.

So many words to NOT answer my question.

Human agency is involved in all computer systems.

We're talking about instructions, not computers.

Your cells are getting instructions from DNA decoding proteins, etc.

It's possible for a computer system to be entirely self-contained. It might, for example, be a standalone system that controls the level of water behind a dam. When the level of the water reaches a certain depth, a gate will open to release water, and close when the water has dropped again. Such a system might continue to operate after all human beings had died.

So no human is giving instructions to the computer. You might want to retract your previous statement, then.
 
So many words to NOT answer my question.

I don't know what more you want. You insisted that a non-physical definition of motion is possible. I accepted that, but I don't see what help it is for your position. The physical definition of motion still exists, in a way that a physical definition of an algorithm does not.

We're talking about instructions, not computers.

Your cells are getting instructions from DNA decoding proteins, etc.

And we can express any physical process in terms of instructions, in that case. The problem is exactly the same as with algorithms - the definition is either too wide, or too narrow.

So no human is giving instructions to the computer. You might want to retract your previous statement, then.

If the computer were not instructed, it would not be doing anything.

I've never claimed that a computer was not a physical object that continues to exist in the absence of human beings.
 
When you hypothesize such things always keep in mind EVOLUTION.


So how would such "embellishment modules" come to evolve?

Now we might speculate that modules that serve a primary evolutionary function may also have a side-effect of "embellishing", but I doubt there are modules that are specifically for "embellishment".

I always keep in mind evolution, because it explains everything.

"Embellishment" is probably not a good word. I was just trying to make a distinction between what seems essential for consciousness and what isn't but could be mistaken for essential elements. The brain is full of complicated, messy but nevertheless useful modules. I don't think "primary evolutionary function" makes sense. I think you mean "primary survival function" or "reproductive function," but I'm really only talking about primary functions for consciousness.
 
The cursor can be described in physical terms? Where is it, exactly?

I didn't say that the cursor can be described in physical terms.

The instructions can also come from a computer or some inanimate process like pressure on a barometer, for instance. You are intentionally picking human agency to support your point about anthropocentrism.

There is always human agency and intentionality involved in any computer system. That is where the algorithms come from.
 
But is anticipating the future an essential feature of survival? It doesn't matter if you have an actual expectation - just have the right sorts of reactions to stimuli. An organism that didn't have the right sorts of reactions wouldn't live to reproduce.

I was wrong about that. Anticipating the future is helpful, but not essential for survival, unless competing against creatures that can.

I was trying to say that anticipating the future was useful for survival, but probably not essential for consciousness to emerge.
 
I always keep in mind evolution, because it explains everything.

"Embellishment" is probably not a good word. I was just trying to make a distinction between what seems essential for consciousness and what isn't but could be mistaken for essential elements. The brain is full of complicated, messy but nevertheless useful modules. I don't think "primary evolutionary function" makes sense. I think you mean "primary survival function" or "reproductive function," but I'm really only talking about primary functions for consciousness.



This short-ish essay by Mark Twain is slightly antiquated in certain aspects but is AMAZING when you consider the epoch.

If you replace a few words with their more appropriate MODERN equivalents the essay is downright SURPRISING in its prescience.

It is extremely interesting in the light of this thread's topic and general discussion in it.

Mark Twain said:
WHAT IS MAN?


I

a. Man the Machine. b. Personal Merit


[The Old Man and the Young Man had been conversing. The Old Man had asserted that the human being is merely a machine, and nothing more. The Young Man objected, and asked him to go into particulars and furnish his reasons for his position.]

[snip great stuff]

We (mankind) have ticketed ourselves with a number of qualities to which we have given misleading names. Love, Hate, Charity, Compassion, Avarice, Benevolence, and so on. I mean we attach misleading MEANINGS to the names. They are all forms of self-contentment, self-gratification, but the names so disguise them that they distract our attention from the fact. Also we have smuggled a word into the dictionary which ought not to be there at all--Self-Sacrifice. It describes a thing which does not exist. But worst of all, we ignore and never mention the Sole Impulse which dictates and compels a man's every act: the imperious necessity of securing his own approval, in every emergency and at all costs. To it we owe all that we are. It is our breath, our heart, our blood. It is our only spur, our whip, our goad, our only impelling power; we have no other. Without it we should be mere inert images, corpses; no one would do anything, there would be no progress, the world would stand still. We ought to stand reverently uncovered when the name of that stupendous power is uttered.

[snip more great stuff]
 
I didn't say that the cursor can be described in physical terms.



There is always human agency and intentionality involved in any computer system. That is where the algorithms come from.

I can't believe we are bickering about what an algorithm is in a thread about the nature of consciousness. It's a very well-defined term. Can we agree to its standard definition? I like this one:

a set of rules that precisely defines a sequence of operations.

The internal operations of brain cells are algorithms executed by chemical reactions, as are the operations of the connections between brain cells. Trillions of these working at the same time can be fairly accepted as a humongously complicated algorithm in which consciousness emerges.
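As a toy illustration of that definition applied to a single cell (a deliberately crude sketch with arbitrary constants, not a claim about real neurochemistry):

```python
def step_neuron(potential: float, input_current: float,
                leak: float = 0.1, threshold: float = 1.0) -> tuple[float, bool]:
    """One update of a toy leaky integrate-and-fire cell: a fixed
    sequence of operations applied to the cell's state."""
    potential = potential * (1.0 - leak) + input_current  # leak, then integrate the input
    fired = potential >= threshold                         # compare to the firing threshold
    if fired:
        potential = 0.0                                    # reset after "firing"
    return potential, fired
```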

If you disagree, explain how it can't.
 
I was wrong about that. Anticipating the future is helpful, but not essential for survival, unless competing against creatures that can.

I was trying to say that anticipating the future was useful for survival, but probably not essential for consciousness to emerge.


Actually that is not necessarily the case. You do not have to be competing against others at all. If you can anticipate droughts or rain or stampeding herds or herd migration routes, etc., that would be a very good survival advantage all by itself, without competing... well, maybe you can call it competing with the elements.
 
I was wrong about that. Anticipating the future is helpful, but not essential for survival, unless competing against creatures that can.

I was trying to say that anticipating the future was useful for survival, but probably not essential for consciousness to emerge.
It probably depends on what you want to consider consciousness, and what you want to consider predicting the future. At some level, predicting the future is a key element of volition.

In particular, every act of volition begins with a plan based on a prediction of what would happen when we initiate certain behaviors. During the act, the results of the behaviors are monitored and compared to the predictions--if they match, we get a sense of control over an action. If they mismatch too much, we get a sense of a lack of control.
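A minimal sketch of that predict-act-compare loop; the callables and the mismatch tolerance are stand-ins for illustration, not a claim about how the brain actually implements it:

```python
def act_with_monitoring(predict, execute, observe, tolerance: float = 0.1) -> bool:
    """Plan from a prediction, act, then compare the outcome to the prediction.

    predict() -> expected (numeric) sensory result of the planned behaviour
    execute() -> carries out the behaviour
    observe() -> actual sensory result afterwards
    Returns True for a sense of control (prediction matched),
    False for a sense of lost control (mismatch too large).
    """
    expected = predict()            # the plan rests on this prediction
    execute()                       # initiate the behaviour
    actual = observe()              # monitor the result
    return abs(actual - expected) <= tolerance
```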

If you consider this aspect critical for consciousness, what remains is whether you consider the predictions inherent in volition to be predictions of future actions (we could draw a line solely on some pragmatic level and call such things "predicting the present", though it's not technically true).

I also suspect along the same lines that these models, when extended into the world (in terms of not only what we can move, but what we can touch and manipulate; and projections of a similar kind) form a basis for intentionality.
 
I can't believe we are bickering about what an algorithm is in a thread about the nature of consciousness. It's a very well-defined term. Can we agree to its standard definition? I like this one:



The internal operations of brain cells are algorithms executed by chemical reactions, as are the operations of the connections between brain cells. Trillions of these working at the same time can be fairly accepted as a humongously complicated algorithm in which consciousness emerges.

If you disagree, explain how it can't.


Animals do NOT have any inbuilt algorithms..... algorithms SIMULATE what animals do.... but animal behavior is not due to any algorithms. It is due to perceptions, actuations, reactions, and feedback with memory.

Look at this flow chart



I think this pdf might explain a lot about what I mean. Read sections 16.1 to 16.3 inclusive.

ETA: The language talks about algorithms and control logic, but that is because the final aim of the chapter is to create a robot that implements the system. However if you consider what actually happens you will see that it is not algorithms per se. Also the blocks that say "control and comparison logic" should really say neuron reactions but it is worded that way to make it RELATABLE to computer programming so as to actually IMPLEMENT the whole thing on a computer.
 
And the distinction is?


If you know how an old-fashioned analog TV/record player worked and how a new digital TV/MP3 player works, then the distinction is like that.

In analog TVs there was circuitry to receive and filter and tune and amplify and change magnetic fields to move electron beams and so forth.

In a digital TV there are receivers to receive and tune but from that point onwards everything is an algorithm run by a computer (or a few) to simulate filters and amplifiers and so on. The output is still analog of course.

In other words in an Analog system there was no COMPUTING going on… there were no procedures and steps. Things worked in response to levels of voltages and currents. Everything worked together for an overall effect. There is no CENTRAL processing system.

A digital system uses a CENTRAL PROCESSING UNIT (CPU) which has an inbuilt procedure (algorithm).
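To make the contrast concrete, here is the standard one-line recurrence a digital system runs, step by step, to stand in for an analog RC low-pass filter (the smoothing coefficient is arbitrary for the sketch):

```python
def rc_lowpass(samples, alpha: float = 0.1):
    """Digitally simulate what an RC circuit does purely with voltage levels:
    y[n] = y[n-1] + alpha * (x[n] - y[n-1]), computed as an explicit procedure."""
    y = 0.0
    out = []
    for x in samples:
        y += alpha * (x - y)   # each output sample is an explicitly computed step
        out.append(y)
    return out
```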


Animal brains and bodies are more akin to an analog TV than they are to a digital TV. There is no CPU.... you may want to argue that the brain is a CPU....but it is not... it is very different from a CPU and very different even from the old analog computers.

If you know what a Neural Network is then that is the closest thing we have come to building anything like a brain..... but not quite even then…. Physical neural nets do not have algorithms unless each node is a CPU but then the CPU is just acting as a more adaptive way to create the analog node that could have been built from analog components but with less adaptability.
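For readers unfamiliar with neural networks, a single node is usually the textbook weighted-sum unit sketched below (the weights and the sigmoid choice are arbitrary for the sketch); whether running this on a CPU amounts to giving the node "an algorithm" is exactly the dispute above:

```python
import math

def node_output(inputs: list[float], weights: list[float], bias: float = 0.0) -> float:
    """One artificial neural-network node: a weighted sum of its inputs
    squashed through a sigmoid, analogous to an analog summing element."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-total))   # sigmoid activation
```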
 
Why is that a problem?
It doesn't follow; consciousness may require something else.


How is it a non-sequitur?
It assumes the emergence of consciousness.
I accept that consciousness is not very well defined, however this should engender caution in making assumptions about consciousness.


Now that is a non-sequitur.
I have always been a fan of Star Trek.

Imagine being the science officer on the Enterprise (Picard's Enterprise).
 
