
My take on why the study of consciousness may indeed not be so simple.

Nope. That's the "simulated orange" category error Pixy mentioned.
Thank you Urchfont.

"category error" is certainly the buzzword around here now isn't it?

But the actual case is that you are making the category error and we are pointing it out to you.
 
Robin said:
But your brain is doing lots of stuff that you are not aware of. That is also behaviour.

So what is the difference between the stuff you don't get to know about and pain?
It's just different kinds of behavior. The awareness itself is a particular sort of behavior.

I think you have to define "behaviour" then.

If you mean "stuff that happens", well of course it is stuff that happens.
It means stuff that happens as a result of stimuli.

But as I said there is a lot of stuff that happens in your brain and only a tiny proportion of stuff you know about in the normal course of the day.

And yet you feel no need to differentiate between them?
I am differentiating, just as I'm differentiating between various forms of overt behavior.

The point of calling consciousness a form of behavior is to try to eliminate the magical quality that always accompanies it. The running mental dialog that I have with myself is a lot like speech, only internal. The images I have are a lot like vision. Mirroring other people's reactions is a lot like noticing my own actions. I think if you break down consciousness it's just a lot of external behaviors replayed while suppressing external behavior.

~~ Paul
 
The conclusion. How do you know that conscious experience isn't simply a matter of processing information?

~~ Paul
Speaking for myself I don't know that it isn't and I don't know that it is. There is no killer argument either way. But I know of no reason why the mere processing of information should result in what I experience as a sensation.

And as I asked above about the desk-checked algorithm - what is the mechanism that binds these events together?
 
I think you have to define "behaviour" then.

If you mean "stuff that happens", well of course it is stuff that happens.

But as I said there is a lot of stuff that happens in your brain and only a tiny proportion of stuff you know about in the normal course of the day.

And yet you feel no need to differentiate between them?


Just to interject, this is the usage in the psychology of behaviorism, whose most modern champion was B. F. Skinner. Just as there are external, visible behaviors, there are private behaviors which are not directly visible (depending on technology), and then there might possibly be behaviors which are 'aware' and 'unaware'. The whole point being that you have to define 'aware'.
 
Robin said:
No, when the brain is using an algorithm then that algorithm is Turing compatible. But that does not entail that everything the brain does is an algorithm.
If the Church-Turing thesis is correct, yes. I agree with your second sentence.

So anything that has random numbers is not, by definition, an algorithm. Any process that involves non-discrete values is not an algorithm, by definition.
The consensus appears to be that a Turing machine with a random number generator is no more powerful than a regular Turing machine. However, it has not been proven.

Nevertheless a system that has some randomness or a system that has non-discrete values can implement an algorithm.

It can't do any computation on natural numbers that a universal Turing machine can't also do.

But that is not to say that it is intrinsically algorithmic.
What do you think it can do that a Turing machine can't do?

Shifting burden of proof

It should be up to the proposer to demonstrate that it is algorithmic.

Can you state a brain as a function on natural numbers?

As I said something that involves non-discrete values is not an algorithm, nor is something that has genuine randomness.
See http://www.internationalskeptics.com/forums/showthread.php?postid=5379231#post5379231. It appears that randomness does not confer additional power.

The brain may have no randomness, but do you think that a simulation of a human brain might involve using approximations of real numbers? If so then it is not equivalent to the thing it is simulating.
I don't understand this point. Computers use approximations of real numbers all the time. They are Turing compatible. You don't think the brain needs noncomputable reals to work, do you?

It has never been demonstrated that a system where stuff happens simultaneously is equivalent to an algorithm.

Stuff happening at the same time or overlapping is not equivalent to a set of discrete steps followed by the incrementing of a time variable.
I have no idea what parallel processing does to the Turing machine model.

So there are lots of ways the brain could not be algorithmic - and I can't really see any way that it could be algorithmic, only that it can implement an algorithm.

So can you state the brain as a function on natural numbers?

Can you even suggest how it might be possible to state the brain as a function on natural numbers?
Nope, can't do that.

All I'm asking is for someone who thinks that there might be a super-computational aspect to the brain to give a coherent description of how it might work or how the brain might employ it. Until that happens, I see no reason to believe there might be such a thing, because that would require me to believe in something magical without justification.

In the meantime, I agree that assuming the brain is fully algorithmic is not completely justified.

~~ Paul
 
It's just different kinds of behavior. The awareness itself is a particular sort of behavior.
So is everything else in the universe.
It means stuff that happens as a result of stimuli.
Which again pretty much involves everything in the universe.
I am differentiating, just as I'm differentiating between various forms of overt behavior.
What do you mean overt?
The point of calling consciousness a form of behavior is to try to eliminate the magical quality that always accompanies it.
But I don't think that anybody here is calling it magical.
The running mental dialog that I have with myself is a lot like speech, only internal.
Inside what?
The images I have are a lot like vision. Mirroring other people's reactions is a lot like noticing my own actions. I think if you break down consciousness it's just a lot of external behaviors replayed while suppressing external behavior.
Again, external to what? Your mind?
 
Robin said:
Speaking for myself I don't know that it isn't and I don't know that it is. There is no killer argument either way. But I know of no reason why the mere processing of information should result in what I experience as a sensation.
Because the processing is going on inside your head. But I understand your point; it's not intuitively obvious.

And as I asked above about the desk-checked algorithm - what is the mechanism that binds these events together?
Sorry, I'm not paying attention to the desk-checking conversation.

~~ Paul
 
Robin said:
What do you mean overt?
External observable behavior.

But I don't think that that anybody here is calling it magical.
Not explicitly, but people appear to be hanging their hats on the idea that there might be something nonalgorithmic going on. Yet no one can specify what that might be.

Inside what?
Inside my brain.

Again, external to what? Your mind?
External to my brain in the sense of being observable.

Consider dreaming. In order to prevent me from walking around and punching people in the nose while I dream, my muscle system has to be turned off. The overt behavior is suppressed and so only the private behavior occurs. Without that suppression it's all of a piece.
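Crudely, here's a toy sketch of that picture (an illustration only, with everything invented for the purpose): the same machinery generates the actions either way, and a gate decides whether they reach the muscles.

```python
class Agent:
    """Toy illustration: 'private' behavior as ordinary behavior with the
    motor gate closed. The same machinery generates actions either way."""
    def __init__(self):
        self.inner_log = []       # privately rehearsed actions
        self.world_log = []       # overt, externally observable actions
        self.motor_enabled = True

    def act(self, action):
        self.inner_log.append(action)       # always happens
        if self.motor_enabled:
            self.world_log.append(action)   # suppressed while 'dreaming'

agent = Agent()
agent.act("wave")                # awake: overt and private
agent.motor_enabled = False      # asleep: motor output switched off
agent.act("punch")               # dreamed only, never executed
print(agent.inner_log)           # ['wave', 'punch']
print(agent.world_log)           # ['wave']
```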

~~ Paul
 
Pain is the behaviour of everything?

Sounds a bit zen.


Pain is simply a label we apply to certain behaviors of parts of the environment. We tend to apply it to some specific parts of the environment, for example the parts that we label as human beings, dogs, and so on.
 
If the Church-Turing thesis is correct, yes.
A misunderstanding of the thesis. There is an algorithm U which does everything that an algorithm can and for any system S, be it algorithmic or otherwise, U can do any operation on natural numbers that S can. But it does not suggest that U can do anything that S can.
The consensus appears to be that a Turing machine with a random number generator is no more powerful than a regular Turing machine. However, it has not been proven.
So what? This "more powerful" thing is a complete red herring.

A system that uses non-discrete quantities is not an algorithm. A system that has some real randomness is not an algorithm.

A system that uses non-discrete quantities can implement an algorithm.

A system that has some real randomness can implement an algorithm.
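To make "implement" concrete, here is a toy sketch of that claim, on the assumption that the noise stays small enough never to flip the discrete result: the internal signal is genuinely random and non-discrete, yet the function computed is the plain deterministic one.

```python
import random

def noisy_adder(a, b, noise=0.05):
    """A 'physical' adder whose internal signal level is non-discrete and
    genuinely noisy, but which still implements exact integer addition,
    because the answer is read off by rounding."""
    signal = a + b + random.uniform(-noise, noise)   # analogue, noisy value
    return round(signal)                             # discrete, exact result

# The system is not itself an algorithm over its real-valued internals,
# but the algorithm it implements (addition) is the ordinary discrete one.
assert all(noisy_adder(i, j) == i + j for i in range(20) for j in range(20))
```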
What do you think it can do that a Turing machine can't do?
Process non-discrete information, map a point on the domain to more than one point on the range.

Is every biochemical event in our brain equivalent to some natural number?
See http://www.internationalskeptics.com/forums/showthread.php?postid=5379231#post5379231. It appears that randomness does not confer additional power.
Again so what?

A: Some things are not light
B: No, nothing can go faster than light
A: I didn't say they could, I said some things are not light
B: What do you think can go faster than light?
A: No, I said some things are not light
B: Do you think that not being light can make them go faster than light?
I don't understand this point. Computers use approximations of real numbers all the time. They are Turing compatible.
You are missing the point, do you think a natural number approximation of a real number is equivalent to the value it is approximating?
I have no idea what parallel processing does to the Turing machine model.
Neither do I, and the brain isn't doing orderly, computer-style parallel processing either.
Nope, can't do that.
So why do you assume that you can?
All I'm asking is for someone who thinks that there might be a super-computational aspect to the brain to give a coherent description of how it might work or how the brain might employ it.
Why would that be relevant?
Until that happens, I see no reason to believe there might be such a thing, because that would require me to believe in something magical without justification.
Nobody is saying that there is a super-computational element in the brain, that is just drkitten's straw man.

The question is, can a system that is not an algorithm implement an algorithm?

Yes it can. It is not more powerful than a Turing Machine. It is just not a Turing Machine.

I don't understand why no-one will even acknowledge that I have said this.
In the meantime, I agree that assuming the brain is fully algorithmic is not completely justified.
i.e., assuming that your brain is a function on natural numbers, that there are no non-discrete processes going on in your brain.
 
Not explicitly, but people appear to be hanging their hats on the idea that there might be something nonalgorithmic going on.
Excuse me, are you saying that anything that is not algorithmic is necessarily magic?

In other words you are saying the Universe is one big Turing Machine and that there are no non-discrete quantities and no real randomness in the Universe????

Can someone who is making this "everything is an algorithm" claim please back it up? It sounds very Wolfram "New Kind of Science"

Why do you think that every other claim requires evidence except for yours?
Yet no one can specify what that might be.
Err... once more - non-discrete quantities, stuff happening simultaneously.
 
Pain is simply a label we apply to certain behaviors of parts of the environment. We tend to apply it to some specific parts of the environment, for example the parts that we label as human beings, dogs, and so on.
But again, that is fine. It is something that could apply to anything in the universe.

What we are looking at is the properties of pain that might differentiate it from, say, the Thames.
 
AkuManiMani said:
Endodermal, mesodermal, and ectodermal tissues -- all of it. If it's alive, it's performing SRIPs.

Please give me an example - a specific example - noting that all of your previous examples have been wrong.

Lemme get this straight. You're honestly arguing that two thermostats in dynamic feedback with each other are examples of SRIPs, but the collective functioning of an entire multicellular organism does not qualif -- pardon -- quantify as a self-referential system? Do you not even realize that the basis of your position is held together by special pleading?
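For concreteness, the two-thermostat case can be pictured with a toy sketch like this (the numbers and the coupling rule are invented purely for illustration): each unit regulates its own room while the rooms also exchange heat, so each one's behavior feeds back into what the other senses.

```python
def step(t_a, t_b, setpoint=20.0, gain=0.5, coupling=0.2):
    """One update of two thermostat-controlled rooms in dynamic feedback:
    each thermostat drives its own room toward the setpoint, while the
    rooms also exchange heat, so each unit's output affects the other."""
    new_a = t_a + gain * (setpoint - t_a) + coupling * (t_b - t_a)
    new_b = t_b + gain * (setpoint - t_b) + coupling * (t_a - t_b)
    return new_a, new_b

t_a, t_b = 15.0, 25.0
for _ in range(10):
    t_a, t_b = step(t_a, t_b)
print(round(t_a, 3), round(t_b, 3))   # both converge toward 20.0
```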

AkuManiMani said:
Hence my objection to your usage of the term. What you're calling "consciousness" isn't what's being discussed when most other English speakers use the word.

In fact, it is precisely what's being discussed; rather, it's that most people (like you) don't know what's happening and how it happens.

Pixy, in an earlier post you stated that even while a person is asleep, there are "conscious processes" occurring. You also stated that the difference between waking and sleeping states is being "aware" vs "unaware". You're clearly using the terminology differently than the common use.

The words "conscious" and "aware" are synonyms in common usage; hence being conscious of one's self and being aware of one's self mean the same thing. When a person is in deep sleep [or otherwise unconscious in common parlance] there is mental activity but they are not considered to be conscious. Unconscious mental activity is considered subliminal and, regardless of whether or not this activity is self-referential, it is still unconscious.


AkuManiMani said:
The logical conclusion of your tautology is that individuals are never unconscious.

Well, duh. If they're unconscious by that definition, they're not individuals, they're corpses.

The point is, that by your definition of consciousness, any organism, with or without a functioning nervous system, is conscious.


AkuManiMani said:
And an organism is made up of up to trillions of such modules. All of them processing information within and between each other, collectively regulating and modifying their behavior and development.

Yes.

AkuManiMani said:
The entire system is inherently self-referential.

No. You still completely fail to understand what self-reference is.

So an organism directing, regulating, and modifying its own behavior, structure, composition, and development is not a self-referential process?

Read Hofstadter. He takes 600 pages to explain it, from many different angles. I've already explained it here repeatedly; if you haven't grasped it by now, you're not going to understand it from a forum post.

After roughly two years of discussions with you, I've already read at least 600 pages' worth of Hofstadter. All you do in these exchanges is mindlessly parrot his arguments and use them as a checklist of accuracy against any statement made.

Read Gödel, Escher, Bach, and then come back. It covers not only self-reference but other key concepts like the Church-Turing thesis and Gödel's Incompleteness Theorem.

As fascinating as the CT-thesis and Gödel's Theorem are, of what relevance are they to the ontological status of consciousness?

You clearly haven't, because the examples you propose do not constitute examples of self-referential information processing. Not one of them.

Authoritative special pleading does not an argument make.

Then why do you keep getting it wrong? In every example, you mistake simple reference for self-reference.

An organism must reference itself in order to maintain and regulate itself. When an organism carries out these functions it must process information inherent to itself. A living organism is a system that references itself, in feedback with its environment. This is true regardless of whether or not it has a brain or nervous system.

You keep getting that wrong, too, of course.

Again, flatly asserting that someone is "wrong" is not an argument, PixyMisa.

That definition you so clumsily avoided? The very first definition?

You've already brought up this point and I've already addressed it.

Awareness of one's thoughts, eh? Thinking is information processing. Thinking about your thoughts is self-referential information processing.

So if someone is not thinking about their own thoughts they aren't conscious. Got it. :rolleyes:
 
AkuManiMani said:
Simply amazing...

No coherent response, then?

I just find your capacity to flatly deny self-evident facts dumbfounding.

Nope. It's simply true. A large part of the reason to run simulations is to work out how the process being simulated actually works. You need to know at some level how the components behave, and how they are organised. You program that in and press enter.

In other words, you have to have some basic knowledge of how the system works to begin with. So you weren't trolling; you just misspoke. Gotcha.

You still haven't explained how my definition is supposed to be different from any other definition that actually matches what happens. I acknowledge that it is different from fairy-tale definitions that people like Chalmers and Jackson like to present.

Normally functioning humans experience the world as having distinctive qualities. Based on your previous comments, you apparently have no personal experience of this, so it's impossible for you to grok the concept. Oh, well...

/shrug


I don't think you are. Read Gödel, Escher, Bach. Really.

I'll grab a copy over winter break then :)

There's reference there. There's computation, yes. That is not enough.

Multiple levels of regulation does not constitute self-reference.

Multiple levels of self-regulation does.

Yes we do. You apparently haven't examined the implications.

The word in the English language already has multiple meanings. That's why psychologists tend to use different and more specific terms like arousal and attention. I'm just pointing this out and providing a more specific (but entirely corresponding) definition for one of those meanings.

Based on some of your earlier statements, you really aren't referring to the same phenomenon. You flatly stated that you do not have any qualitative experience. I do.

AkuManiMani said:
Being directly aware of any stimuli external to ourselves.

That's supposed to be a counter-example?

First, it's important to note that you are not "directly aware" of any external stimuli. Ever. That's simply not how things work. Everything is mapped through multiple levels of abstraction.

Regardless of how filtered our senses are, any external stimulus that we become aware of is by definition not self-referential. It's in reference to an external object.


Now, if you're talking about simply responding to the stimulus - blink at a sudden bright light - that needn't involve consciousness at all. (At least, not the consciousness that is you; it may well be that the prestriate cortex forms an autonomous consciousness in its processing of visual perception, for example.

That's exactly my point. What you're calling "autonomous consciousness" is an example of an unconscious mental process -- just as the vast majority of the subsystems within our body are unconscious.

Of course, you would have no way to tell directly, but we could work this out by simulating it.)

I think most would agree that neuroscience is best conducted by studying actual brains. Without an understanding of actual brain function there is no valid basis upon which to even draw a valid model for simulation.

So what level are you talking about? Stimulus? Association? Attention?

The "consciousness that is you", as you put it.

Yes, we know you have that problem. Try to get over it.

What can I say? I'm a sucker for inane discussions with mindless chat bots. We all have our quirks :p
 
A misunderstanding of the thesis. There is an algorithm U which does everything that an algorithm can and for any system S, be it algorithmic or otherwise, U can do any operation on natural numbers that S can. But it does not suggest that U can do anything that S can.
Sorry, you missed my edit. If the Church-Turing thesis is correct, then any algorithm is Turing compatible. I agree that not everything need be an algorithm.

So what? This "more powerful" thing is a complete red herring.

A system that uses non-discrete quantities is not an algorithm. A system that has some real randomness is not an algorithm.

A system that uses non-discrete quantities can implement an algorithm.

A system that has some real randomness can implement an algorithm.
A Turing machine with an RNG is BPP, and it appears to be the case that BPP = P. If this is the case, what can the TM+RNG do that the TM alone cannot?
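To make the "no more powerful" point concrete, here is a toy sketch (an illustration only, not a proof that BPP = P): a bounded-error randomized test can be rerun deterministically by enumerating every possible random string and taking a majority vote. The same function on natural numbers gets computed without the RNG, only possibly much more slowly; whether it can always be done efficiently is exactly the BPP = P question.

```python
import itertools

def randomized_decide(x, random_bits):
    """Toy bounded-error test for 'x is even': it deliberately answers
    wrongly when the random string is (1, 1), so its error rate is 1/4."""
    if random_bits == (1, 1):
        return x % 2 != 0          # the wrong answer, on one branch
    return x % 2 == 0              # the right answer, on the other three

def derandomized_decide(x, num_bits=2):
    """Deterministic simulation: run the test on every possible random
    string and take the majority vote. No random number generator needed."""
    votes = [randomized_decide(x, bits)
             for bits in itertools.product((0, 1), repeat=num_bits)]
    return votes.count(True) > len(votes) // 2

assert all(derandomized_decide(n) == (n % 2 == 0) for n in range(100))
```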

Process non-discrete information, map a point on the domain to more than one point on the range.

Is every biochemical event in our brain equivalent to some natural number?
That is the question.

Again so what?

A: Some things are not light
B: No, nothing can go faster than light
A: I didn't say they could, I said some things are not light
B: What do you think can go faster than light?
A: No, I said some things are not light
B: Do you think that not being light can make them go faster than light?
You don't need to keep teaching me this. What do you think about the TM+RNG? Is the RNG a clincher?

You are missing the point, do you think a natural number approximation of a real number is equivalent to the value it is approximating?
No. Do you think that the brain needs more than just an approximation of any noncomputable reals?

So why do you assume that you can?
Because I haven't heard any coherent suggestion for what else might be necessary.

Nobody is saying that there is a super-computational element in the brain, that is just drkitten's straw man.

The question is, can a system that is not an algorithm implement an algorithm?

Yes it can. It is not more powerful than a Turing Machine. It is just not a Turing Machine.
What? Are you suggesting that there is something that is no more powerful than a Turing machine yet is not compatible with one?

i.e., assuming that your brain is a function on natural numbers, that there are no non-discrete processes going on in your brain.
Even if there are, is this a deep problem? Any nondiscrete process can be performed to any degree of accuracy by a Turing machine.
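For instance, here is a toy sketch (nothing deeper than bisection): a non-discrete value such as sqrt(2) can be pinned down to any chosen accuracy using only exact rational arithmetic and discrete, rule-following steps.

```python
from fractions import Fraction

def sqrt2_to_within(eps):
    """Approximate sqrt(2) to within eps by bisection over the rationals.
    Every step is discrete and exact; the accuracy is whatever you ask for."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo

print(float(sqrt2_to_within(Fraction(1, 10**15))))   # agrees with sqrt(2) to ~15 decimal places
```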

I have no vested interest in the brain being a Turing machine. Certainly the haphazard parallel processing might be an issue. I'm just waiting for a compelling argument that the brain can't be a Turing machine.

Here is a paper on this very subject:

http://74.125.93.132/search?q=cache...ss"+"turing+machine"&cd=1&hl=en&ct=clnk&gl=us
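On the parallel-processing worry, the very simplest case can at least be made concrete: a fully synchronous update, where every element changes "at once" based on the shared old state, can be replayed one discrete step at a time from a snapshot. This is only a toy sketch; it says nothing about the brain's asynchronous, continuous-time messiness.

```python
from concurrent.futures import ThreadPoolExecutor

def rule(state, i):
    # Toy update rule: each cell becomes the sum of itself and its left neighbour.
    return state[i] + state[i - 1]

def parallel_step(state):
    """All cells updated 'at the same time' by worker threads reading the old state."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda i: rule(state, i), range(len(state))))

def sequential_step(state):
    """The same update replayed as discrete steps, one cell at a time, from a snapshot."""
    snapshot = list(state)
    return [rule(snapshot, i) for i in range(len(snapshot))]

state = [1, 2, 3, 4]
assert parallel_step(state) == sequential_step(state)   # same next state either way
```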

~~ Paul
 
Robin said:
Excuse me, are you saying that anything that is not algorithmic is necessarily magic?

In other words you are saying the Universe is one big Turing Machine and that there are no non-discrete quantities and no real randomness in the Universe????
No.

Why do you think that every other claim requires evidence except for yours?
I don't. As I said above, the burden of proof is on both sides.

Err... once more - non-discrete quantities, stuff happening simultaneously.
Possibly, yes. I'm just looking for a little more detail.

But what about all the stuff that happens inside your brain that you are not conscious of?

Do you call that internal or external?
Internal behavior that I'm not aware of. There is also external behavior that I'm not aware of.

~~ Paul
 
The conclusion. How do you know that conscious experience isn't simply a matter of processing information?

~~ Paul

I guess the simplest way to put it is this: Information processing is ubiquitous throughout our physiology and almost all of it is unconscious. Therefore the defining feature of consciousness is not information processing. It's the active capacity to be aware of information as having subjective qualities.

To clarify, I'm not denying that we are processing information when we are conscious; I'm saying the difference between conscious and unconscious processing is a difference of -kind- rather than -degree-.
 
AkuManiMani said:
I guess the simplest way to put it is this: Information processing is ubiquitous throughout our physiology and almost all of it is unconscious. Therefore the defining feature of consciousness is not information processing. It's the active capacity to be aware of information as having subjective qualities.
Which may be just additional information processing.

To clarify, I'm not denying that we are processing information when we are conscious; I'm saying the difference between conscious and unconscious processing is a difference of -kind- rather than -degree-.
Yes, a difference in the kind of information processing. :D
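As one toy picture of what a different kind of information processing might look like (an illustration only, not a model of a brain): the same machinery that processes outside input can be turned on its own records, which is the narrow sense of "self-referential processing" used earlier in the thread.

```python
class Monitor:
    """Toy 'thinking about thoughts': the system logs its own processing
    steps and can then process those logs as ordinary input."""
    def __init__(self):
        self.trace = []

    def process(self, item):
        self.trace.append(f"processed {item!r}")
        return item

    def reflect(self):
        # Self-reference in the narrow sense: the same processing machinery
        # applied to a record of its own prior activity.
        return [self.process(step) for step in list(self.trace)]

m = Monitor()
m.process("sound of rain")
m.process("smell of coffee")
print(m.reflect())   # ["processed 'sound of rain'", "processed 'smell of coffee'"]
```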

~~ Paul
 
