My take on why indeed the study of consciousness may not be as simple

It's in a previous post, but I'll go over it again.

A is run for 15 simulated seconds. I can build an equivalent machine, A', out of NAND gates, and run an equivalent program for 15 simulated seconds.
Right.

During the run of A', I can record the inputs to each NAND gate, which I have numbered, at each step in the process.
Right.

I create four ordered partitions of each of these sets--one for each of the four possible inputs of the NAND gate: (00), (01), (10), and (11) (I go in numbered order first, then in time order).
Okay, so you've scrambled the ordering of the inputs.

Next, I'm going to verify that the outputs match by desk-checking each NAND gate's computation. I check in this order--I run a check of the first calculation in partition (00), then the first in (01), then the first in (10), then the first in (11), and then proceed to the next in (00), the next in (01), and so forth.
Right. So you know that each NAND gate works.

So to perform this check, I have to calculate (00), (01), (10), (11), and repeat. If I reach the end of a partition I could fill it with dummy calculations, and proceed in this order.
No you can't. But that's a side issue.

This particular check I can delegate to a four-state machine that calculates (00), then (01), then (10), then (11), then repeats. That's what N is.
Right.
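For concreteness, here's a minimal sketch in Python of the check described above. Everything here is my own stand-in: a four-gate NAND circuit (wired as an XOR) plays the role of A', and empty partitions are simply skipped rather than padded with dummy calculations.

```python
# Toy sketch of the desk check: run a NAND circuit, record every gate's
# inputs and output, partition the records by input pattern, then verify
# them in the round-robin order (00), (01), (10), (11) -- the machine N.
from collections import deque

def nand(a, b):
    return 1 - (a & b)

def run_circuit(x, y, steps=1):
    """Four numbered NAND gates wired as XOR; returns the gate trace."""
    records = []  # (gate_number, time, (input_a, input_b), output)
    for t in range(steps):
        g1 = nand(x, y)
        g2 = nand(x, g1)
        g3 = nand(y, g1)
        out = nand(g2, g3)
        for n, (a, b, o) in enumerate([(x, y, g1), (x, g1, g2),
                                       (y, g1, g3), (g2, g3, out)], start=1):
            records.append((n, t, (a, b), o))
    return records

records = run_circuit(1, 0, steps=15)

# Partition by input pattern, going in gate-number order, then time order.
partitions = {(0, 0): deque(), (0, 1): deque(), (1, 0): deque(), (1, 1): deque()}
for rec in sorted(records, key=lambda r: (r[0], r[1])):
    partitions[rec[2]].append(rec)

# N: cycle (00), (01), (10), (11), desk-checking one record from each in turn.
while any(partitions.values()):
    for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        if partitions[pattern]:
            n, t, (a, b), out = partitions[pattern].popleft()
            assert nand(a, b) == out  # the gate really computed NAND
```

The point of the sketch is only that N's check touches every calculation A' performed, but in an order that has nothing to do with the original computation.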

N strictly follows the rule that it calculates everything that A' did, and it takes a lot of leeway in doing so. N is built this way specifically to be able to say "no that's not exactly what I mean", or, "yes, that works". That's all.
Okay.

Then while my take on exactly what N was doing was wrong, at least part of my reply still applies: you have a set of data that doesn't mean anything. Unless you have a mapping that can put it back in the right order, it doesn't say anything at all. With the set of data and the mapping, you have consciousness; with N and the mapping, you can interject and ask the consciousness a question.

Without the mapping, you have simply randomised the data. In effect you are asking "If I stick your brain in a blender and puree it, is it still you afterwards?" Unless you have an unblender that can unpuree the brain, the answer is no. But if you do have an unblender, then the puree'd brain plus the unblender is still me.
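The "blender plus unblender" point can be put in code. This is my own toy illustration, not anything from the thread: a permutation scrambles the data, and the same permutation, kept around, is exactly the mapping that recovers it.

```python
# Toy illustration: scrambled data is recoverable only if you keep the mapping.
import random

def scramble(data, seed):
    perm = list(range(len(data)))
    random.Random(seed).shuffle(perm)   # perm is the "blender"
    return [data[i] for i in perm], perm

def unscramble(blended, perm):
    original = [None] * len(perm)       # perm, inverted, is the "unblender"
    for out_pos, in_pos in enumerate(perm):
        original[in_pos] = blended[out_pos]
    return original

data = list("a conscious moment")
blended, mapping = scramble(data, seed=42)
assert unscramble(blended, mapping) == data  # data + mapping round-trips
```

Without `mapping`, `blended` is just a bag of symbols; with it, the original is fully recoverable.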
 
That works! That's exactly what I'm looking for.

But this leads to another question. I have this mapping, but the mapping is just sitting there. Now do I have to run N to implement consciousness? Or is there a different thing being argued?
Well, it depends on what you want it to do. Consciousness is a verb, remember. Unless it's doing something, it doesn't actually exist. And the only way to instantiate it is via the mapping.

(ETA: Oh, and also, does the mapping have to be physically represented? Is it enough that it "exists" in an abstract, mathematical sense of the word? If I have no map and run an N, I don't know how to get useful results out, but would you still say that there's a scrambled consciousness in the calculations?)
If there is at least some well-defined way to generate the mapping, then there is some well-defined way to generate the consciousness. As I said just above, consciousness is a verb, it exists in doing. If it can in practice be re-run, then it in practice can exist; if it in principle can be re-run, then it in principle can exist. If the mapping is just gone, then it's just gone.

Again, the goal isn't to debate--it's an attempt to, let's say, do a "breadth first" slice through a lot of ambiguity. It's just groundwork to get a more precise statement out, is all.
Yep, I'm with you now.
 
In effect you are asking "If I stick your brain in a blender and puree it, is it still you afterwards?" Unless you have an unblender that can unpuree the brain, the answer is no. But if you do have an unblender, then the puree'd brain plus the unblender is still me.
This is not to say that I would have no objections to the process. ;)
 
Ah, a correction. N doesn't technically randomize things... it actually sorts them.

So it's not at all like putting your brain in a blender, so you don't have to worry about that. It's more like putting your brain in a centrifuge.

But I'll assume your objections to the process probably remain nonetheless.
 
What part of our physiology are you talking about?

Endodermal, mesodermal, and ectodermal tissues -- all of it. If it's alive, it's performing SRIPs.

If you mean the brain, then yes, the brain performs self-referential information processing when you are asleep.

Hence my objection to your usage of the term. What you're calling "consciousness" isn't what's being discussed when most other English speakers use the word.

That means that you are conscious when you are asleep.

If that's your objection, then it's simply equivocation. Conscious vs. unconscious in the sense of aware vs. unaware is simply not the same as awake vs. asleep.

The logical conclusion of your tautology is that individuals are never unconscious.


No, not at all. [Homeostasis is] just a feedback loop. The feedback loop doesn't reference itself.

One thermostat is not conscious. Two thermostats, though, could be.

And an organism is made up of up to trillions of such modules. All of them processing information within and between each other, collectively regulating and modifying their behavior and development. The entire system is inherently self-referential.

No, you're just failing to grasp the concept.

I've grasped the concept from the get-go. It's not at all difficult to understand. The problem is that you're refusing to see that what you're calling consciousness is not the phenomenon being discussed.

AkuManiMani said:
So you -don't- experience sensations with distinctive qualities?

No. I experience sensations with distinctive quantities. Only this, and nothing more.


If that's true then you must be of a radically different make-up than other humans, PixyMisa.

And qualia are still baloney.

[...]

No, qualia are baloney. Sensations are quantitative.

Simply amazing...


We don't need to know how the brain works to simulate it.

Please tell me that you're just trolling...


It is a mathematical and physical fact that the brain can be simulated, regardless of how it works, and that a conscious mind will result from the simulation.

Assuming your special definition of the word "consciousness", sure.


First, no, you have completely failed to grasp what self-reference means. (How this is possible given the number of times it has been explained I have no idea.)

I dunno, Pixy. Maybe I'm just shtoopid.

You are talking about simple reference - which does happen throughout biological systems, even at the cellular level.

An individual cell's systems of genetic expression and epigenetic regulation alone meet the operational criteria of a SRIP. This isn't even taking into account other systems within the cell linked to these regulatory processes, or the higher level systems of regulation they're tied to in multicellular critters.

Second, no, we aren't unconscious when we are asleep or even under general anaesthesia. We still have conscious processes going on in our brains until we're brain-dead - until the cells have actually started to die en masse.

As is already blatantly obvious to anyone who isn't brain dead, you and I don't mean the same thing when we say "conscious". Did it ever cross your mind that perhaps it is you who's failing to grasp something crucial?

AkuManiMani said:
Constantly repeating that consciousness = SRIP does not make it any more true, and appealing to Hofstadter as if his words are holy writ does not make your position any more sound. Just drop it, dude.

If you could just provide me with one counterexample, that would have a lot more weight than all your comprehensive failures of understanding to date.

Being directly aware of any stimuli external to ourselves.


If you were raising the simulated oranges thing - water and electricity in your case - as if it were a coherent objection, you're not even playing the game. You're not even on the same continent where the game is being played.

Judging from some of your above statements, I'm not even certain you're operating in the same universe as the rest of us, Pixy...
 
Last edited:
Endodermal, mesodermal, and ectodermal tissues -- all of it. If it's alive, it's performing SRIPs.
Please give me an example - a specific example - noting that all of your previous examples have been wrong.

Hence my objection to your usage of the term. What you're calling "consciousness" isn't what's being discussed when most other English speakers use the word.
In fact, it is precisely what's being discussed; rather, it's that most people (like you) don't know what's happening and how it happens.

The logical conclusion of your tautology is that individuals are never unconscious.
Well, duh. If they're unconscious by that definition, they're not individuals, they're corpses.

And an organism is made up of up to trillions of such modules. All of them processing information within and between each other, collectively regulating and modifying their behavior and development.
Yes.

The entire system is inherently self-referential.
No. You still completely fail to understand what self-reference is. Read Hofstadter. He takes 600 pages to explain it, from many different angles. I've already explained it here repeatedly; if you haven't grasped it by now, you're not going to understand it from a forum post.

Read Gödel, Escher, Bach, and then come back. It covers not only self-reference but other key concepts like the Church-Turing thesis and Gödel's Incompleteness Theorem.

I've grasped the concept from the get-go.
You clearly haven't, because the examples you propose do not constitute examples of self-referential information processing. Not one of them.

It's not at all difficult to understand.
Then why do you keep getting it wrong? In every example, you mistake simple reference for self-reference.

The problem is that you're refusing to see that what you're calling consciousness is not the phenomenon being discussed.
You keep getting that wrong, too, of course.

That definition you so clumsily avoided? The very first definition?

1. the state of being conscious; awareness of one's own existence, sensations, thoughts, surroundings, etc.
Awareness of one's thoughts, eh? Thinking is information processing. Thinking about your thoughts is self-referential information processing.
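The contrast being argued over -- simple reference versus self-reference -- can be made concrete in code. This is my own toy example, not anything from the thread: the first device only references the world; the second also takes its own recent processing as input and makes decisions about its own decisions.

```python
# Simple reference: a thermostat reads an external quantity and reacts.
def thermostat(temp, setpoint=20.0):
    return "heat on" if temp < setpoint else "heat off"

# Self-reference (in the sense argued above): the process also processes a
# record of its own processing, and modifies itself on that basis.
class ReflectiveThermostat:
    def __init__(self, setpoint=20.0):
        self.setpoint = setpoint
        self.history = []  # a record of its own outputs

    def step(self, temp):
        decision = "heat on" if temp < self.setpoint else "heat off"
        self.history.append(decision)
        # Process its own processing: if its last four decisions flip-flop,
        # lower its own setpoint -- a decision about its decisions.
        if self.history[-4:] in (["heat on", "heat off"] * 2,
                                 ["heat off", "heat on"] * 2):
            self.setpoint -= 0.5
        return decision
```

The plain thermostat is a feedback loop that never references itself; the second device's input includes a description of its own behaviour, which is the distinction at issue.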

If that's true then you must be of a radically different make-up than other humans, PixyMisa.
Nope. I just acknowledge what's going on.

Simply amazing...
No coherent response, then?

Please tell me that you're just trolling...
Nope. It's simply true. A large part of the reason to run simulations is to work out how the process being simulated actually works. You need to know at some level how the components behave, and how they are organised. You program that in and press enter.

Assuming your special definition of the word "consciousness", sure.
You still haven't explained how my definition is supposed to be different from any other definition that actually matches what happens. I acknowledge that it is different from fairy-tale definitions that people like Chalmers and Jackson like to present.

I dunno, Pixy. Maybe I'm just shtoopid
I don't think you are. Read Gödel, Escher, Bach. Really.

An individual cell's systems of genetic expression and epigenetic regulation alone meet the operational criteria of a SRIP.
There's reference there. There's computation, yes. That is not enough.

This isn't even taking into account other systems within the cell linked to these regulatory processes, or the higher level systems of regulation they're tied to in multicellular critters.
Multiple levels of regulation does not constitute self-reference.

As is already blatantly obvious to anyone who isn't brain dead, you and I don't mean the same thing when we say "conscious".
Yes we do. You apparently haven't examined the implications.

The word in the English language already has multiple meanings. That's why psychologists tend to use different and more specific terms like arousal and attention. I'm just pointing this out and providing a more specific (but entirely corresponding) definition for one of those meanings.

Did it ever cross your mind that perhaps it is you who's failing to grasp something crucial?
Oh, sure, many times.

Never in talking to you, though.

Being directly aware of any stimuli external to ourselves.
That's supposed to be a counter-example?

First, it's important to note that you are not "directly aware" of any external stimuli. Ever. That's simply not how things work. Everything is mapped through multiple levels of abstraction.

Now, if you're talking about simply responding to the stimulus - blink at a sudden bright light - that needn't involve consciousness at all. (At least, not the consciousness that is you; it may well be that the prestriate cortex forms an autonomous consciousness in its processing of visual perception, for example. Of course, you would have no way to tell directly, but we could work this out by simulating it.)

So what level are you talking about? Stimulus? Association? Attention?

Judging from some of your above statements, I'm not even certain you're operating in the same universe as the rest of us, Pixy...
Yes, we know you have that problem. Try to get over it.
 
Ah, a correction. N doesn't technically randomize things... it actually sorts them.
Well, sorting is a sort of randomness. ;)

So it's not at all like putting your brain in a blender, so you don't have to worry about that. It's more like putting your brain in a centrifuge.

But I'll assume your objections to the process probably remain nonetheless.
Yep. Even if you have an uncentrifuge!
 
Who says behavior has to be overt? Think of consciousness as internal behavior. What problems arise that would make you think it is something more?
I think you have to define "behaviour" then.

If you mean "stuff that happens", well of course it is stuff that happens.

But as I said there is a lot of stuff that happens in your brain and only a tiny proportion of stuff you know about in the normal course of the day.

And yet you feel no need to differentiate between them?
 
Well, keep in mind that entire worlds have been created that are nothing more than bits on silicon.

If you suggested such a thing to someone who hadn't seen it, they would say "but those are just bits on silicon..."

But then their jaw will drop when they see something like GTA4, which fits neatly on an Xbox360.
But there is more than bits on silicon: there is complex hardware to translate those bits into images, sounds, and movement, and there is the complex hardware of our eyes, ears, and brains. A lot more than just bits on silicon.

But for the desk-checked algorithm there is no mechanism whatsoever.

I write down 2+2=4 on a piece of paper.

The marks mean nothing except to an intelligent being who understands them.

The brain activity means nothing except to me and is gone in an instant.

I write down 4+3=7

The universe does not make a connection between this and the other marks on the paper, only I can make the connection.

The universe does not save the brain state or relate it to previous brain states, which would be meaningless outside my head in any case.

So there is a mechanism for calculating a set of numbers, and that is all; those numbers are not meaningful except to an intelligent being who can interpret them.

But there is no mechanism for producing conscious states, to the universe these are just a set of random events.

This is not just my intuition any more; it is a fact that there is no known or conceivable mechanism to connect these events, and the events themselves - the brain events, the marks on paper - are as good as random except to an intelligent being.

Again - what is the mechanism you are suggesting?
 
But it is not a proof that all algorithms are Turing compatible. So the brain could employ an algorithm that is not Turing compatible.
No, when the brain is using an algorithm then that algorithm is Turing compatible. But that does not entail that everything the brain does is an algorithm.

An algorithm is a mathematical entity involving natural numbers and must be equivalent to a function on natural numbers.

So anything that has random numbers is not, by definition, an algorithm. Any process that involves non-discrete values is not an algorithm, by definition.

Nevertheless a system that has some randomness or a system that has non-discrete values can implement an algorithm.

It can't do any computation on natural numbers that a universal Turing machine can't also do.

But that is not to say that it is intrinsically algorithmic.

The bottom line is that just because a system can run a Turing Machine does not imply that the system is a Turing Machine.

So the underlying premiss of all this is just wrong.
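The distinction being drawn here -- a system containing randomness can still implement an algorithm exactly -- can be sketched in a few lines. This is my own illustration: the random element affects only the execution order, not the function computed.

```python
# Sketch: a process with randomness in its operation can still implement a
# deterministic algorithm. Here the random part only decides the order in
# which independent work items are handled; the function computed -- the
# sum -- is identical on every run.
import random

def noisy_sum(xs):
    pending = list(enumerate(xs))
    random.shuffle(pending)   # non-deterministic execution order
    total = 0
    for _, x in pending:      # but addition is order-independent
        total += x
    return total

assert noisy_sum([1, 2, 3, 4]) == 10  # same answer regardless of the shuffle
```

The system's behaviour as a whole (which item it touches first) is not a function on natural numbers, yet the algorithm it implements is.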
 
Correct. That is why we keep asking for a coherent description of how the brain might not be algorithmic.

~~ Paul
Shifting burden of proof

It should be up to the proposer to demonstrate that it is algorithmic.

Can you state a brain as a function on natural numbers?

As I said something that involves non-discrete values is not an algorithm, nor is something that has genuine randomness.

The brain may have no randomness, but do you think that a simulation of a human brain might involve using approximations of real numbers? If so then it is not equivalent to the thing it is simulating.

It has never been demonstrated that a system where stuff happens simultaneously is equivalent to an algorithm.

Stuff happening at the same time or overlapping is not equivalent to a set of discrete steps followed by the incrementing of a time variable.

So there are lots of ways the brain could not be algorithmic - and I can't really see any way that it could be algorithmic, only that it can implement an algorithm.

So can you state the brain as a function on natural numbers?

Can you even suggest how it might be possible to state the brain as a function on natural numbers?
 
It allows it to be non-deterministic, which a Turing machine can't do. Hence it can do things a Turing machine can't, and hence it is more powerful.
Well "more powerful" was always a red herring. It does not matter that it is not more powerful.

The premiss of all this is that if a system can implement a Turing Machine then there is a Turing Machine that is equivalent to it.

If there is even one counter example then the premiss fails.

And a system that contains some real randomness can implement a Turing Machine, so we have the counter example and the premiss fails.

A system that contains non-discrete values is not a Turing Machine, but it can implement a Turing Machine.

It is highly questionable whether a system that has a number of simultaneous or overlapping and communicating events can be a Turing Machine either, but it can certainly implement one.
 
