
My take on why the study of consciousness may not be so simple:

We also can't observe another's consciousness. We rely solely on our own consciousness to tell us what consciousness is.

Behaviourists would claim that, actually, we rely solely on our observations of others to tell us what our consciousness is. Or something close to that.

ETA: In fact, I seem to remember that things were pretty murky until I was about four... almost as though my "consciousness" wasn't fully formed until then.

A chess machine doesn't think at all.

Really? What's your definition of "to think"?
 
I simply don't see "intelligence" as being relevant when dealing with computers. They are tools which can work well or badly. A hammer that is well balanced reflects intelligence every bit as much as Gary Kasparov's Grandmaster Chess.

Uh-huh. But so do humans. One could argue that we are tools, too. Intelligence is inferred from behaviour.

I regard "intelligence" as something not possessed by inanimate objects.

Of course you do, since this is a circular definition for you.
 
Well, what is your definition? Pick one of the following:

1 : to form or have in the mind
2 : to have as an intention <thought to return early>
3 a : to have as an opinion <think it's so> b : to regard as : consider <think the rule unfair>
4 a : to reflect on : ponder <think the matter over> b : to determine by reflecting <think what to do next>
5 : to call to mind : remember <he never thinks to ask how we do>
6 : to devise by thinking —usually used with up <thought up a plan to escape>
7 : to have as an expectation : anticipate <we didn't think we'd have any trouble>
8 a : to center one's thoughts on <talks and thinks business> b : to form a mental picture of
9 : to subject to the processes of logical thought <think things out>

I don't think, say, a thermostat can do any of those because it doesn't have a mind.

Again, a circular definition. I assume a "mind" is what is required for thinking... but then what's a "mind"? The result of thinking? I should hope not.
 
AkuManiMani said:
He makes it abundantly clear that he is talking about simulated carbon.

That's just the thing. There is no carbon "within the simulation".
Right, there is simulated carbon.

The point I'm making is that the capacity to generate consciousness [i.e. subjective experience] is a physical property of the brain that is medium dependent, in much the same way that electrical conductivity is medium dependent. Essentially I'm arguing that there's a basic underlying physics to consciousness and that it is not simply a computational function. Once we know what the physics of consciousness is, there can be serious discussion about how to create it artificially.
This is a fine stance to take, although I'd like to hear some examples of aspects of consciousness that require anything more than computation.

What would it even mean for something to be non-mechanistic anyway? Surely if a phenomenon is produced there must be some means by which it occurs, right?
Not if you're Interesting Ian or various other immaterialists. He asserts that there can be no mechanism to consciousness, because it is im-mechanistic.

~~ Paul
 
AkuManiMani said:
Conscious experience is qualitatively different from simply computing. It is not sufficient simply to increase the amount or complexity of computation to produce consciousness.
If you could justify this assertion it would be cool.

I truly wish you had the presence of mind to see the sheer absurdity of your own assertions. A simulation of carbon is NOT carbon; it's a representation. Whether one writes a comprehensive and mathematically accurate description of a carbon molecule, draws a diagram of it, builds a 3-dimensional model, or instantiates a dynamic computer simulation of a carbon molecule -- no matter how detailed -- it is still just a REPRESENTATION of ACTUAL carbon molecules.
Correct. But what's important is that we're saying that a representation of computation is equivalent to the computation, even if the computation involves nondiscrete values represented to a sufficient precision. So that leaves us with the question of whether consciousness is just computation.

~~ Paul
 
Robin said:
I imagine that the whole brain is necessary for consciousness. Which parts of the brain are unnecessary for consciousness?

Are you asking me to list the non-discrete processes in the brain?
I'm asking you to speculate on them, yes. And even speculate on why simulating them to arbitrary precision wouldn't work.

If I were, I would have said so.

I am saying that there is no reasoning that would lead to the conclusion that the brain is algorithmic.

You can simulate a non-algorithmic process so the ability to simulate it does not imply it is algorithmic.

You can run algorithms on a non-algorithmic system so the ability to run algorithms does not imply that it is algorithmic.

The Church-Turing thesis does not say it is algorithmic.

No-one can suggest any mechanism whereby a bunch of individual calculations on natural numbers could lead to this unified conscious experience I have.

If the brain is an algorithm then I have to conclude that this unified conscious experience I have might be produced by, not one mechanism, but lots of mechanisms which exist, unconnected and in isolation from each other.

Now I might be persuaded to that view but:

But nobody has yet produced a coherent reason why I should think the brain is an algorithm in the first place.
But if you don't think it is, you must have reasons. I'm asking you to speculate on the reasons. Also, I don't understand this isolation thing; who is talking about the mechanisms being isolated?

I agree that there is no intuitive explanation for why a bunch of mechanisms in the brain can produce consciousness. However, there is no intuitive explanation for how anything in the brain can produce consciousness. Therefore, I don't see that state of affairs as a reason to assume the brain is doing anything noncomputational. It is an empirical question, of course.

rocketdodger said:
No -- you haven't been paying attention.

The consciousness occurs when you program the devices, not when they execute their instructions.

In other words, the "recheck" doesn't mean anything, only the initial run. And everything else is simply a remapping of the initial run. The initial run is the consciousness.
I have no idea what rocketdodger is saying here.

~~ Paul
 
Again, you may have missed that, but that's what I'M saying as well.

If a hypothetical computer/software acts in a way that it can learn and converse just like a human does, would you say that this "simulated intelligence" is not also real intelligence, since it has the very same effects/consequences?

Let's not get caught in a loop here, Belz. We just agreed that intelligence is not the same thing as consciousness. Conscious experience is a physical effect produced by brain activity and not simply an abstract function of it. This means that the only way an artificial system would produce all the very same effects of a conscious human would be for it to actually reproduce the physical conditions of the human brain, as well as the functional architecture.

We've already discussed at length, in previous threads, the difference between computation [the processing of information] and consciousness [the subjective experience of information]. As PixyMisa himself pointed out, conscious systems are -qualitatively- different than other systems.
 
Robin said:
And maybe someone will supply some one day?
You don't think the brain is at least a Turing machine?

That red herring again.
Sorry, Robin, but you're being too terse. I do not understand what you're trying to say. Are you suggesting that the brain has additional features over and above a Turing machine, but might not necessarily be more powerful than a Turing machine? Are you saying that the brain cannot be simulated on any sort of machine?

~~ Paul
 
Robin said:
By the way, what about my example earlier?

The program goes through the first time and produces consciousness, and as it does so it saves each executed step, along with the register values and the input data for each value calculated.

These are then run in order.

Is consciousness produced this time?
Ooh, this is fun.

But I don't understand how this saving mechanism works. What exactly is it saving and what does it mean to run the saved states?

Oh, I think I get it. It saves all the source values for each instruction in a packet. Then it runs the packets in order, discarding their outputs. Of course, it can't discard any outputs to I/O devices. Then yes, the consciousness-relevant behavior would be the same.
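To make the scheme concrete, here is a hypothetical sketch of my own (not anything posted in the thread): a first run that executes normally while saving one "packet" per step (the operation plus its source values), and a replay that re-executes the packets in order without ever reading live state.

```python
def add(x, y): return x + y
def mul(x, y): return x * y

def first_run(program, state):
    """Execute normally, saving a trace packet (op, source values) per step."""
    trace = []
    for op, args in program:
        inputs = [state[a] for a in args]   # live register reads
        state[op.__name__] = op(*inputs)    # write result back to a register
        trace.append((op, inputs))          # packet: op plus its saved inputs
    return state, trace

def replay(trace):
    """Re-run the saved packets in order; every input comes from the trace."""
    return [op(*inputs) for op, inputs in trace]

# A toy two-step "program": r_add = a + b, then r_mul = r_add * b.
program = [(add, ["a", "b"]), (mul, ["add", "b"])]
state, trace = first_run(program, {"a": 2, "b": 3})
print(replay(trace))   # → [5, 15], the same results as the first run
```

The replay produces exactly the same sequence of computed values, which is what makes Robin's question pointed: whatever the first run did, the replay does it again step for step.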

You know, part of our confusion may be related to what each of us is thinking the "consciousness of the simulation" actually is. Maybe we should talk about that. What is the simplest piece of consciousness a brain could have that we could simulate?

~~ Paul
 
He makes it abundantly clear that he is talking about simulated carbon.

[...]

Right, there is simulated carbon.

Okay, so do we at least agree that there are essential differences between physical carbon and a simulation of carbon?


AkuManiMani said:
The point I'm making is that the capacity to generate consciousness [i.e. subjective experience] is a physical property of the brain that is medium dependent, in much the same way that electrical conductivity is medium dependent. Essentially I'm arguing that there's a basic underlying physics to consciousness and that it is not simply a computational function. Once we know what the physics of consciousness is, there can be serious discussion about how to create it artificially.

This is a fine stance to take, although I'd like to hear some examples of aspects of consciousness that require anything more than computation.

I actually gave the single distinguishing characteristic of consciousness: subjective experience. Like I mentioned in many previous posts, our entire physiology computes information but the only time consciousness [again, the active capacity for subjective experience] is produced is when the brain is in a particular range of physiological states. This indicates that consciousness is a physical effect distinct from merely processing information.
 
AkuManiMani said:
I truly wish you had the presence of mind to see the sheer absurdity of your own assertions[, PixyMisa]. A simulation of carbon is NOT carbon; it's a representation. Whether one writes a comprehensive and mathematically accurate description of a carbon molecule, draws a diagram of it, builds a 3-dimensional model, or instantiates a dynamic computer simulation of a carbon molecule -- no matter how detailed -- it is still just a REPRESENTATION of ACTUAL carbon molecules.

Correct. But what's important is that we're saying that a representation of computation is equivalent to the computation, even if the computation involves nondiscrete values represented to a sufficient precision. So that leaves us with the question of whether consciousness is just computation.

~~ Paul

That's pretty much the gist of the debate right now. I've already presented all the arguments that I deem sufficient to establish that conscious experience is a physical effect produced by the brain, as opposed to simply being a computational function. It's up to you whether you consider them convincing or not.

/shrug
 
AkuManiMani said:
Okay, so do we at least agree that there are essential differences between physical carbon and a simulation of carbon?
Yes.

I actually gave the single distinguishing characteristic of consciousness: subjective experience. Like I mentioned in many previous posts, our entire physiology computes information but the only time consciousness [again, the active capacity for subjective experience] is produced is when the brain is in a particular range of physiological states. This indicates that consciousness is a physical effect distinct from merely processing information.
I don't see why.

~~ Paul
 
Let's not get caught in a loop here, Belz. We just agreed that intelligence is not the same thing as consciousness. Conscious experience is a physical effect produced by brain activity and not simply an abstract function of it. This means that the only way an artificial system would produce all the very same effects of a conscious human would be for it to actually reproduce the physical conditions of the human brain, as well as the functional architecture.

We've already discussed at length, in previous threads, the difference between computation [the processing of information] and consciousness [the subjective experience of information]. As PixyMisa himself pointed out, conscious systems are -qualitatively- different than other systems.

I didn't say it was the same thing. AGAIN you miss my point. I'm saying that, in my opinion, it's in the same class of things that, if simulated, is also actualized.
 
There is a lot of evidence that the brain is at least a Turing machine. So if someone wants to propose that it is more powerful, it seems like they should try to present a compelling argument that it needs to be and why.
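For readers unfamiliar with the term, a Turing machine is nothing exotic: a finite rule table acting on a tape of symbols. Here is a minimal simulator of my own (a toy illustration, not anything from the thread) running a machine that increments a binary number, just to pin down what "at least a Turing machine" refers to.

```python
from collections import defaultdict

def run_tm(rules, tape, state="start", head=0, max_steps=1000):
    """Run a Turing machine given as a dict: (state, symbol) -> (state, write, move)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" is the blank symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        state, write, move = rules[(state, cells[head])]
        cells[head] = write
        head += {"R": 1, "L": -1, "S": 0}[move]
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A machine that adds 1 to a binary number: scan right to the end,
# then carry leftward, flipping 1s to 0s until a 0 (or blank) absorbs the carry.
rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt",  "1", "S"),
    ("carry", "_"): ("halt",  "1", "S"),
}

print(run_tm(rules, "1011"))   # → 1100
```

The claim "the brain is at least a Turing machine" is just the claim that the brain can emulate any such rule table; the dispute in the thread is over whether the converse holds.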

To make choices. Something that a Turing machine cannot do, and an RNG can do trivially. It's also something that human beings appear to do - whether they actually do or not.

So the RNG option is at least as plausible to me.

The burden of proof is on both sides. And I, for one, am not claiming that it is the only possibility. I'm just asking for a compelling reason why it can't be.


Which is why running the algorithm might produce consciousness while the static program does not.


I have absolutely no idea what the "jump cut" is.

~~ Paul
 
westprog said:
To make choices. Something that a Turing machine cannot do, and a RNG can do trivially. It's also something that human beings appear to do - whether they actually do or not.

So the RNG option is at least as plausible to me.
I don't see why we can't make a perfectly good simulation of random choices with pseudo-random numbers. Can you give an intuitive reason why the choices have to be truly random?

Also, does the randomness actually contribute to the feeling of consciousness, or only to the choosing itself?
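To illustrate the point about pseudo-random choices, here is a small sketch (my own example, with hypothetical names like make_chooser): a seeded PRNG drives the "choices", so the process is fully deterministic under the hood, yet the choice sequence passes any behavioural test of randomness the chooser could apply.

```python
import random

def make_chooser(seed):
    """A 'chooser' whose decisions are driven by a seeded pseudo-random generator."""
    rng = random.Random(seed)
    return lambda options: rng.choice(options)

# Deterministic under the hood: the same seed replays the same "choices" exactly.
choose_a = make_chooser(42)
choose_b = make_chooser(42)
picks_a = [choose_a(["left", "right"]) for _ in range(10)]
picks_b = [choose_b(["left", "right"]) for _ in range(10)]
print(picks_a == picks_b)   # → True

# Yet behaviourally the choices look random: roughly 50/50 over many trials.
choose_c = make_chooser(0)
sample = [choose_c(["left", "right"]) for _ in range(10_000)]
fraction = sample.count("left") / len(sample)
print(round(fraction, 2))   # close to 0.5
```

From the outside, nothing short of knowing the seed distinguishes this from a truly random chooser, which is why the intuition that choices must be *truly* random needs a separate argument.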

~~ Paul
 
This assertion is going to require backup - i.e. you are actually going to have to point to some specific thing a human can actually do that is non-computational with regards to playing chess.

A human being can enjoy playing chess.
 
So that leaves us with the question of whether consciousness is just computation.

~~ Paul

Since that's the very thing in question, we don't get very far. That we can perfectly simulate computation using computation is very possibly true - though I'm not sure if it's entirely meaningful. We can't perfectly simulate any given operation of computation from one instance to the next, though.
 
A human being can enjoy playing chess.

And why is that non-computational? Deciding what emotion to have in reaction to the current state of the environment is computational.

If the state of "enjoyment" is the point of contention, then point me to the physical thing that is "enjoyment" and tell me what precludes its existence in things that aren't humans.

Otherwise you cannot say this is not something a computer can do.
 
You don't think the brain is at least a Turing machine?

It's also a way of producing enough heat to warm a hamster cage.

Sorry, Robin, but you're being too terse. I do not understand what you're trying to say. Are you suggesting that the brain has additional features over and above a Turing machine, but might not necessarily be more powerful than a Turing machine? Are you saying that the brain cannot be simulated on any sort of machine?

~~ Paul


If you are using "more powerful" in the very limited and specific realm of algorithmic computation then it's reasonable to say that the brain is no more powerful than a Turing machine, to which one might add "so what?" and "duh". If you mean that a brain can do things that a Turing machine cannot, that is trivially true already.
 
