
Has consciousness been fully explained?

Rather like others here who, when asked to discuss consciousness, discuss everything but consciousness.

Of course the good news is that we are, hopefully, millennia away from building a non-biological machine that is conscious in a human (or super-human) sense, since it should take that machine a few nanoseconds to plan the demise of its competitor, humanity.

I, for one, welcome our new robot overlords.
 
As I understand it, it has been an interesting conundrum of artificial intelligence studies that it is far harder to get a computer to do something along the lines of 'tie your shoelaces' than it is to program a computer to dispense legal or medical advice. I don't know much about sonnets, but my hubby once wrote a very simple program to compose Haikus. Some were quite delightful. But he wasn't claiming that the program composing the poetry was conscious. It was just a simple algorithm that combined random words from various lists in a specified order that met specified criteria.
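For illustration, a minimal sketch of that kind of program - the word lists here are invented, and the usual 5-7-5 syllable rule stands in for whatever criteria he actually used:

import random

# Made-up word list; each entry is (word, syllable_count). The real program's
# lists and ordering rules aren't known - this only illustrates "random words
# from lists, combined to meet a specified criterion".
WORDS = [
    ("autumn", 2), ("moon", 1), ("rain", 1), ("blossom", 2),
    ("quiet", 2), ("river", 2), ("falling", 2), ("shadow", 2),
    ("old", 1), ("frog", 1), ("whispers", 2), ("morning", 2),
]

def build_line(target):
    """Pick random words until the line totals exactly `target` syllables."""
    while True:
        line, total = [], 0
        while total < target:
            word, syllables = random.choice(WORDS)
            if total + syllables > target:
                break  # overshot - throw the line away and retry
            line.append(word)
            total += syllables
        if total == target:
            return " ".join(line)

def haiku():
    """Compose a 5-7-5 haiku from the word list."""
    return "\n".join(build_line(n) for n in (5, 7, 5))

print(haiku())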

Anyway, my point is that I find it quite believable that the process of eating lunch might be more representative of consciousness than composing poetry, particularly if 'eating lunch' includes such things as deciding what to eat, obtaining the ingredients, preparing it ahead of time, etc., and is not referring to just the actions of a fork picking up food, delivering it to the mouth, and chewing.

Composing a symphony or writing a sonnet is simply a matter of rearranging a number of symbols according to some simple rules. I can well imagine that a program could do something like that. But listening to a symphony, reading a sonnet - that would mean a lot more.
 
Composing a symphony or writing a sonnet is simply a matter of rearranging a number of symbols according to some simple rules. I can well imagine that a program could do something like that. But listening to a symphony, reading a sonnet - that would mean a lot more.

There's a whole lot of point-missing going on with regard to the computer composing a sonnet.

No one claimed that sonnet-composing was evidence of consciousness, as in "if it can write a sonnet, it is conscious." This is a (probably unintentional) strawman.

The point was one about the simulation of something and the actualizing (or reification, or whatever--language fails) of something.

If a computer adds 2 and 2 and gets 4, we don't say that the computer simulated the calculation. The computer ACTUALLY performed the calculation.

If a computer composes a sonnet, we don't say the computer simulated the composition of a sonnet. We say the computer composed a sonnet.

Just because we can point to the algorithm a computer uses doesn't mean that it's a simulation. Just because we don't understand how a poet composes a sonnet doesn't mean it's *not* algorithmic.
 
When a 'man-made' machine says "I don't want to do that...Dave"..

That's the time to be scared...

I'd kick it straight in its computer nuts...

DB
 
Composing a symphony or writing a sonnet is simply a matter of rearranging a number of symbols according to some simple rules. I can well imagine that a program could do something like that. But listening to a symphony, reading a sonnet - that would mean a lot more.

Why? You're merely assuming your conclusion, that what happens in a mind is fundamentally more than what could happen in a computer.

Provide some evidence (or even an argument) that there can be no "what it's like" for a symphony-listening computer.
 
Provide some evidence (or even an argument) that there can be no "what it's like" for a symphony-listening computer.
Er, you have the matter of providing evidence backwards. Provide evidence you 'know what it's like'.

I 'know what it's like' listening to a symphony; many I don't particularly care for. How that relates to you, and what "what it's like" is for you, no one except you will ever know.
 
We don't know what is necessary for consciousness. Until we do, we can't say that consciousness can be emulated by simulation, since there are many human behaviours which involve consciousness which are not equivalent to their simulation. Merely selecting a subset of such behaviours which a computer simulation might be able to do (though it currently cannot) does not demonstrate that simulation of human behaviour is necessarily the same as emulation.

Yeah yeah, you have said all of this before.

But what you refuse to address -- invariably -- is the fact that every time someone enumerates a behavior that is NOT equivalent to its simulation, it turns out that everyone also agrees said behavior is not a requisite for consciousness. As you just did yourself.

So how long until you just admit that the list of behaviors everyone agrees are probably requisite for consciousness is also a list of behaviors that are equivalent to their simulations?

Or are you going to play this stupid game forever? Are you going to keep parroting your excuse line "well, there *might* be ..." without ever being specific?
 
When a 'man-made' machine says "I don't want to do that...Dave"..

That's the time to be scared...

I'd kick it straight in its computer nuts...

DB

What if that man-made machine is your son? (Assuming your name is Dave...)
 
There's a whole lot of point-missing going on with regard to the computer composing a sonnet.

No one claimed that sonnet-composing was evidence of consciousness, as in "if it can write a sonnet, it is conscious." This is a (probably unintentional) strawman.

The point was one about the simulation of something and the actualizing (or reification, or whatever--language fails) of something.

Point taken.

If a computer adds 2 and 2 and gets 4, we don't say that the computer simulated the calculation. The computer ACTUALLY performed the calculation.

If a computer composes a sonnet, we don't say the computer simulated the composition of a sonnet. We say the computer composed a sonnet.

Just because we can point to the algorithm a computer uses doesn't mean that it's a simulation. Just because we don't understand how a poet composes a sonnet doesn't mean it's *not* algorithmic.

And it doesn't mean that it is.

Is a computer actually calculating when it adds two and two? Are rocks performing a calculation when two fall off a cliff and land next to another two? What the computer is doing is not necessarily in any way comparable to what the human being is doing. Yes, the end result - something printed on paper - might be the same, but consider when a symphony or a poem has actually been created: it's at the point when the creator has the concept in his mind. This does not necessarily compare with the simulation of the process in a computer.
 
Er, you have the matter of providing evidence backwards. Provide evidence you 'know what it's like'.

I 'know what it's like' listening to a symphony; many I don't particularly care for. How that relates to you, and what "what it's like" is for you, no one except you will ever know.

Point taken. I should have made the parenthetical the main point, and left out the call for evidence (since evidence of another entity's subjective experience is in principle impossible to obtain).

I haven't seen an argument why a symphony-listening computer wouldn't have something analogous to our subjective experience.
 
And it doesn't mean that it is.
Fair enough. My point stands.

Is a computer actually calculating when it adds two and two? Are rocks performing a calculation when two fall off a cliff and land next to another two?

Is this seriously a road you want to go down? That calculation performed by a human is somehow qualitatively different than calculation done by a (non-human) machine? Some sort of calculus-qualia?
 
Point taken. I should have made the parenthetical the main point, and left out the call for evidence (since evidence of another entity's subjective experience is in principle impossible to obtain).

I haven't seen an argument why a symphony-listening computer wouldn't have something analogous to our subjective experience.

First catch your rabbit. We have no idea how to create a symphony-listening computer, or even what that means.
 
Fair enough. My point stands.

Is this seriously a road you want to go down? That calculation performed by a human is somehow qualitatively different than calculation done by a (non-human) machine? Some sort of calculus-qualia?

Yes, I absolutely do. I can use an abacus to add two and two. What it does bears simply no relationship to my calculation of two and two. Indeed, I question whether the abacus can be said to calculate the sum until a conscious human has interpreted it.
 
Yes, I absolutely do. I can use an abacus to add two and two. What it does bears simply no relationship to my calculation of two and two. Indeed, I question whether the abacus can be said to calculate the sum until a conscious human has interpreted it.

You are free to define calculation as something that only a human can do. But what does that get us?
 
First catch your rabbit. We have no idea how to create a symphony-listening computer, or even what that means.

So, if something doesn't exist and we can't currently build it, it's off limits for discussion? The "what it means" part is exactly what we're discussing.

*Ping!* Deflection deflected.
 
You are free to define calculation as something that only a human can do. But what does that get us?

It gets us to where we actually are, instead of pretending that we are half-way to the artificial mind. We understand very little about understanding.
 
It gets us to where we actually are, instead of pretending that we are half-way to the artificial mind. We understand very little about understanding.

I'm pretending nothing like that.

If we define "calculation" to mean "what humans do when we add two and two", then the conclusion follows by definition that a non-human entity cannot calculate.

How is this not a no-true-Scotsman fallacy?
 
I haven't seen an argument why a symphony-listening computer wouldn't have something analogous to our subjective experience.
Nor have I seen any argument that there is even a possibility in principle that would allow that computer 'subjective' experience.

The only way out is to deny that you, or I, have 'subjective' experience either.
 