Explain consciousness to the layman.

Status
Not open for further replies.
... looks like we will soon be able to read minds, which would mean my internal behaviour is no longer private; it's just one more behaviour that can be observed by others. It's behaviour all the way down now!

The Economist reported something similar but it was with images and the electrodes weren't even in direct contact with the brain. A person looking at a picture would broadcast an eerie simulacrum in a signal that could be picked up outside the skull. I think the electrodes were in contact with the scalp but I'm not positive.

I wonder if people could obscure their thoughts as a workaround, though. "Jesus is Lord. Not."
 
What does it physically mean to "perform an algorithm"?


Not very long ago, a computer was a person who computed. S/he followed a sequence of instructions, performing elementary calculations and, perhaps, looking up logarithms in tables, until s/he reached the end of their instructions.

S/he performed a specified algorithm to calculate the value(s) that were desired from the given input.

Unbelievably tedious and error-prone work.

We use machines for that now (for the most part).
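To make the "human computer" picture concrete, here is a minimal sketch (my own illustration, not anything from the thread) of the procedure described above: multiply two numbers by looking up their logarithms in a precomputed table, adding them, and taking the antilog. The table and function names are invented for the example.

```python
import math

# A toy "log table": base-10 logarithms rounded to 4 decimal places,
# like the printed tables a human computer would have looked values up in.
LOG_TABLE = {n: round(math.log10(n), 4) for n in range(1, 1000)}

def multiply_via_logs(a, b):
    """Multiply two integers the way a human computer might have:
    look up log(a) and log(b), add them, then invert the logarithm."""
    total = LOG_TABLE[a] + LOG_TABLE[b]   # step 1: look up and add
    return round(10 ** total)             # step 2: take the antilog

print(multiply_via_logs(37, 25))  # → 925
```

Each step is elementary and fully specified in advance; the person (or machine) carrying it out needs no understanding of why the answer is wanted.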

(You supposedly have a background in programming - how could you possibly not know this?)
 
Look at this news report that I posted above: http://www.bbc.co.uk/news/science-environment-16811042 . It looks like we will soon be able to read minds, which would mean my internal behaviour is no longer private; it's just one more behaviour that can be observed by others. It's behaviour all the way down now!

Not just others, but also yourself.

I think once people can sit at a monitor and look at screen displaying *exactly* what their neurons are doing when they are sitting looking at a screen displaying *exactly* what their neurons are doing, they might finally be enlightened.
 
Arguing about something that is not defined would seem to me to be even more pointless.

Well, you can say that it's something that isn't even defined, so there's no point in arguing about it. Or, you can argue that the basis of consciousness is now well understood, and that it's a matter of a particular type of computer programming.

I don't see how it's possible to maintain both positions, though.
 
Not very long ago, a computer was a person who computed. S/he followed a sequence of instructions, performing elementary calculations and, perhaps, looking up logarithms in tables, until s/he reached the end of their instructions.

S/he performed a specified algorithm to calculate the value(s) that were desired from the given input.

Unbelievably tedious and error-prone work.

We use machines for that now (for the most part).

That's an excellent utilitarian description. It's obviously anthropocentric. Algorithms are something we use. Performing an algorithm is something human beings do.

For an algorithm to be meaningful as a physical action, separately from human concerns - something that can be objectively described without reference to human wishes and intentions - it has to be specified in physical terms.

If we had to describe electricity as something we use to turn lights on so we can read at nighttime, it would be true, but obviously wouldn't constitute a physical description of electricity. A physical description would describe electricity in terms of how physical quantities interact, rather than its usefulness.

(You supposedly have a background in programming - how could you possibly not know this?)

I don't "supposedly" have a background in programming. I've demonstrated my understanding of the concepts. So if I am questioning the meaning of a very well understood term like "algorithm", it might be worth a second glance at what I'm saying to try to figure out what I'm getting at. That second glance would reveal the word "physically", which is the critical element of the question.

I hate to downplay my own importance, but my own background, beliefs and personality are really not very significant in the context of the possible creation of artificial consciousness. I don't know why they get more attention.
 
Well, you can say that it's something that isn't even defined, so there's no point in arguing about it. Or, you can argue that the basis of consciousness is now well understood, and that it's a matter of a particular type of computer programming.

I don't see how it's possible to maintain both positions, though.

Easy: the basics of consciousness are well understood, but we're still arguing semantics.
 
That's an excellent utilitarian description. It's obviously anthropocentric. Algorithms are something we use. Performing an algorithm is something human beings do.

For an algorithm to be meaningful as a physical action, separately from human concerns - something that can be objectively described without reference to human wishes and intentions - it has to be specified in physical terms.

If we had to describe electricity as something we use to turn lights on so we can read at nighttime, it would be true, but obviously wouldn't constitute a physical description of electricity. A physical description would describe electricity in terms of how physical quantities interact, rather than its usefulness.

An algorithm's physical description is "a series of steps in a process."

I suppose you now want a physical description of "a," "series," "steps," "in," and "process?"

I hate to downplay my own importance, but my own background, beliefs and personality are really not very significant in the context of the possible creation of artificial consciousness. I don't know why they get more attention.

They are significant, however, in the context of this discussion.

Because if we are trying to explain things to people genuinely interested, and you come in here and vomit your dualist nonsense whilst also claiming some authority on the matter -- because you have done "programming" in your past -- we are gonna do whatever we can to defend the integrity of the discussion. Up to, and including, demonstrating that your supposed authority seems to the rest of us to not be what you claim it to be.
 
Actually, no. It's something computers do. Human brains are also computers.

Then there needs to be a definition of an algorithm that doesn't reference human beings and their intentions and needs.
 
Everybody on this forum seems to be an expert on me.

It isn't hard. We can read, and we have memory.

For instance, I haven't forgotten all of these positions of yours:

-- running is not a physical process, because there is no physical description of running.

-- a human being assembled from scratch in a laboratory would not be conscious, even if it functioned normally.

-- if we are living in a simulation, we are not actually conscious.

-- something is not a physical description for X unless it exists in a physics textbook.

-- the processes going on in bacteria that lead them to behave differently from rocks can only be observed by intelligent beings.

-- from the point of view of a non-intelligent being, a bowl of soup behaves the same as a computer.

If I got any of those wrong, please let me know (oh, and don't forget to explain how I got them wrong).
 
Then there needs to be a definition of an algorithm that doesn't reference human beings and their intentions and needs.

How about "a series of steps in a process?"

Do you disagree that any number of the step-by-step reactions that occur in bacteria, for example, that allow them to behave as a living organism could be considered algorithms?

I am not sure what else you would call the cellular control mechanisms displayed by life, if not an algorithm.
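One way to sketch a definition that drops the human reference entirely, as the exchange above asks for, is to describe an algorithm as nothing but repeated state transitions: a state, a rule mapping each state to the next, and a halting condition. This framing is my own illustration, not something either poster proposed; the function names are invented for the example.

```python
def run(transition, state, halted):
    """Iterate a transition rule on a state until the halting
    predicate holds. Nothing here refers to intentions or uses:
    just a state and a rule mapping each state to the next."""
    while not halted(state):
        state = transition(state)
    return state

# Euclid's algorithm, expressed purely as transitions on a pair of numbers.
gcd_step = lambda s: (s[1], s[0] % s[1])
gcd_done = lambda s: s[1] == 0

print(run(gcd_step, (48, 18), gcd_done)[0])  # → 6
```

Whether a bacterium's reaction cascades count as "performing" such transitions is, of course, exactly what the thread is arguing about; the sketch only shows that the step-by-step structure itself can be stated without mentioning anyone's purposes.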
 
That's really nothing but a circular argument by way of definition. If you define a robot as "not alive" and then define consciousness (in part) as "something that living things do", then a robot cannot be conscious, but only because you have defined it that way.
I was not making an argument, I was illustrating my point.

I agree with your observations about definitions; however, I see little need for precise definitions here. It's a simple observation in consideration of the evolutionary emergence of consciousness.

However in the real world what you are doing is merely asserting your conclusion.
Life emerged from inanimate matter with features (behaviors), which developed into consciousness later on. These features were from the beginning what distinguished it from that inanimate matter.

One could even consider that life and consciousness are one and the same thing.

Now if in the future humanity produced a living machine, I would expect it to exhibit embryonic consciousness and develop in time into a consciousness rather like that experienced by a human.

I see no reason to consider that inanimate machines or computers can be conscious in any sense, the best they can do is mimic it.
 
No, I don't see your point and you still don't understand mine.

You say consciousness requires biological life because... well, because that's the type of consciousness we know already.

But before we built machines which could have locomotion, only biological life could have locomotion, so your logic is flawed.

I'm not using logic. I'm pointing out that it appears that living is a requirement for consciousness, and that inanimate computation, while operating in a similar manner, is not conscious in the same way. It is always a toaster.
 
Then there needs to be a definition of an algorithm that doesn't reference human beings and their intentions and needs.


There are such definitions of 'algorithm'.

(I thought you were supposed to have some background in programming.)
 