The Hard Problem of Gravity

There are some things that brains do that silicon-based IBM PC compatibles do not do--for example, metabolize glucose. But there's nothing that brains can do that computers can't do, because brains are computers.

How does that help us? If it turns out (and it's very unlikely) that metabolising glucose is critical to consciousness, how does considering the brain as a computer help? It seems to be just giving it a name.

When you say that the brain is a computer, are you saying that all its functionality is equivalent to a digital computer? Because I strongly disagree with this.
 
When you say that the brain is a computer, are you saying that all its functionality is equivalent to a digital computer?

No, he is saying the set of all things brains can compute is a subset of the set of all things computers can compute.

I expect you will disagree, since the above only follows if you assume that the physical processes taking place in the brain are known physical processes.

So, once again, it all comes down to your magical unknown physical processes.
 
How does that help us? If it turns out (and it's very unlikely) that metabolising glucose is critical to consciousness, how does considering the brain as a computer help? It seems to be just giving it a name.
You're giving computers the same exact name. Think about it.
When you say that the brain is a computer, are you saying that all its functionality is equivalent to a digital computer?
Huh? Pay attention, westprog. You just quoted me saying the complete opposite. The brain is more than "just a computer"--it is a glucose metabolizer, for example. But any other computer is also more than "just a computer".

What I'm doing is showing you what you are doing wrong. You're making claims about what computers--which are more than just computers (mine being a heater, a transmitter/receiver, etc.)--cannot do.

It makes things worse when the things you claim a computer cannot do aren't even incidental to being a computer--when you're talking about effectively calculable tasks, such as coming up with a joke, as opposed to incidental properties, such as metabolizing glucose.
 
Yes, it was, and no, you actually have not, in fact, answered it.

Unless you consider vomited word salad--which begs the very question it is supposedly clearing up--to be an "answer."

Simply put: all you gotta do is stop pretending that you know what consciousness is and wait until the physical sciences have an answer. Until then you're just creating intelligent systems that you merely allege to be conscious.
 
Simply put: all you gotta do is stop pretending that you know what consciousness is and wait until the physical sciences have an answer. Until then you're just creating intelligent systems that you merely allege to be conscious.

Yeah -- another non-answer.

I am not really interested in talking to you about this anymore -- you have made it clear that your claims and assertions far outreach your actual knowledge in the relevant areas.
 
You're giving computers the same exact name. Think about it.

I'm not quite sure what I'm supposed to be saying here, so I'll try to explain.

Clearly, if we consider what computers can do and include everything that something does in addition to its function as a computer, then they can do anything. All we need to do is connect something to a computer, and we have an object that is a computer and also has property X, whatever that may be.

So in that sense we can never say that a computer cannot do something, if that thing is possible at all. However, what we are considering is whether a computer can do some particular task by computing. The particular thing we are interested in is whether computation, as performed on any computing device, without regard to implementation, is sufficient to produce consciousness.

Since the statement "a computer can't write jokes" is therefore inevitably false if anything at all can write jokes, the statement would be better interpreted as "it is impossible for an algorithmic computational process to produce jokes". My own view is that no such algorithmic process has yet been found, and that it may be impossible for an algorithmic process to understand jokes in the same way that a human does.


And when (or if) I write "computers cannot communicate", what I am trying to say is that the exchange of information between computations does not mean that information is communicated in the same sense as it is between humans, where understanding of the information is involved.

If yy2bggggs is asking us to avoid using the word "computers" when what we are really talking about is computer programs, then he's probably right.
 
Understanding. Interpretation. Experience.

You guys aren't short on dualistic words to try and make consciousness "special". Whenever you're asked to define those words, however, somehow the definitions always apply to computers, too.

"Dualistic words". Perhaps you could make a list of those bad dualistic words which should be avoided. Go through the dictionary with a black marker.
 
If he's arguing computers can't in principle ever do x, that would be a very difficult argument to make.

AFAIAA, I have not stated this. And I believe that Rocketdodger and Pixy believe that computers - by which I think they mean algorithmic processes - are in principle able to duplicate anything in human experience. I'm not sure if that's what they are saying though, so I'll leave this space


blank for them.
 
And I believe that Rocketdodger and Pixy believe that computers - by which I think they mean algorithmic processes - are in principle able to duplicate anything in human experience.
Close.

Computers are in principle able to simulate any physical system. And a simulation of an information processing system is an information processing system.
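
To make "simulate any physical system" concrete, here is a minimal sketch in Python (the damped oscillator and all the numbers are arbitrary illustrative choices, not anything anyone in the thread has claimed): the physics is reproduced by nothing more than a series of well-defined arithmetic operations.

Code:
# Minimal illustrative sketch: "simulating a physical system" -- here a
# damped harmonic oscillator stepped forward with Euler integration.
# All constants are arbitrary choices for the example.
dt = 0.001               # time step, in seconds
x, v = 1.0, 0.0          # initial position and velocity
k, m, c = 4.0, 1.0, 0.1  # spring constant, mass, damping coefficient

for _ in range(10000):        # 10 simulated seconds
    a = (-k * x - c * v) / m  # Newton's second law: acceleration from forces
    v += a * dt
    x += v * dt

print(x, v)  # the simulated system's state after 10 seconds

Whether that kind of step-by-step reproduction is enough to produce consciousness is, of course, exactly what's in dispute.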
 
No. Brains are computers, so if brains do things, computers do them (e.g., brains do them). There are some things that brains do that silicon-based IBM PC compatibles do not do--for example, metabolize glucose. But there's nothing that brains can do that computers can't do, because brains are computers.

I guess we'll need to define "computer" then (if it hasn't been already): a network of switches that operate on [binary] data?

Not just like the logic switches in a computer--neurons are logic switches.

They can directly handle analog signals, of course. (I'm differentiating between current-generation PCs and neurons; if the brain is a "computer", it's a moot point).

...not exactly. Anything a brain can do, a computer can do too, in practice, because brains are computers.

But aren't we interested in whether consciousness is... "substratum-independent", I think is the phrase? To say "a brain is a computer" doesn't tell us whether any computers, other than brains, can generate consciousness, unless we make "generating consciousness" part of the essential definition of "computer". (Maybe this is a side issue; I'm new to the thread.)

AFAIAA, I have not stated this. And I believe that Rocketdodger and Pixy believe that computers - by which I think they mean algorithmic processes - are in principle able to duplicate anything in human experience. I'm not sure if that's what they are saying though, so I'll leave this space

blank for them.

As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].
 
But aren't we interested in whether consciousness is... "substratum-independent", I think is the phrase? To say "a brain is a computer" doesn't tell us whether any computers, other than brains, can generate consciousness
Church-Turing thesis.

As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].
Then computers can still generate human consciousness, by simulating that composition.
 
AkuManiMani said:
Simply put: all you gotta do is stop pretending that you know what consciousness is and wait until the physical sciences have an answer. Until then you're just creating intelligent systems that you merely allege to be conscious.

Yeah -- another non-answer.

I am not really interested in talking to you about this anymore -- you have made it clear that your claims and assertions far outreach your actual knowledge in the relevant areas.

Translation:

'Dodger said:
Yeah -- I don't like your answers.

I am not really interested in talking to you about this anymore -- you have made it clear that you're not going to accept my baseless assertions. In the meantime I'm going to pretend that I actually know how to synthetically create consciousness.

Whatever floats your boat.
 
Then computers can still generate human consciousness, by simulating that composition.

Indeed it is impossible to prove that we are not inhabiting a simulation right now.

Not that I think we are -- I just think the fact that we can't prove otherwise speaks volumes.
 
AFAIAA, I have not stated this. And I believe that Rocketdodger and Pixy believe that computers - by which I think they mean algorithmic processes - are in principle able to duplicate anything in human experience. I'm not sure if that's what they are saying though, so I'll leave this space


blank for them.

If they simply left it at "in principle" instead of claiming that it's already been achieved, I doubt there would be much disagreement. Unfortunately, they haven't established that current systems have [or are even capable of generating] conscious experience.
 
As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].

I don't think there is much reason to assume that it's limited to human brains. It's just that, since we are humans and know that we have the capacity for consciousness, that's the best starting point for investigation. Once we zero in on exactly what physical process in the brain constitutes conscious experience, and the 'whys' and 'hows' of it, we can extrapolate from there.
 
I guess we'll need to define "computer" then (if it hasn't been already): a network of switches that operate on [binary] data?
A computer is a device that carries out a series of well-defined operations. This doesn't have to be binary, or even digital.
They can directly handle analog signals, of course. (I'm differentiating between current-generation PCs and neurons; if the brain is a "computer", it's a moot point).
Neurons trigger catastrophically when the "sum" of their inputs (technically some can be, and often are, inhibitory) reaches a certain level. If by analog signal you mean to refer to the strength of inputs, and by "handle" you mean firing, then, no.
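
For illustration, here's a minimal sketch of that behaviour in Python, assuming a McCulloch-Pitts-style threshold unit (a deliberate simplification of a real neuron, with negative weights standing in for inhibitory inputs):

Code:
# McCulloch-Pitts-style sketch of all-or-none firing.
# Negative weights play the role of inhibitory inputs.
def fires(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return total >= threshold  # catastrophic: it fires or it doesn't

# Three excitatory inputs reach the threshold...
print(fires([1, 1, 1, 0], [0.5, 0.5, 0.5, -1.0], 1.0))  # True
# ...but an active inhibitory input suppresses firing.
print(fires([1, 1, 1, 1], [0.5, 0.5, 0.5, -1.0], 1.0))  # False

The graded strengths of the inputs matter, but the output is binary.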
But aren't we interested in whether consciousness is... "substratum-independent", I think is the phrase?
Depends on who "we" is. I'm particularly taking an emphatic disinterest in the HPC. My interests lie with making the thread seem less annoying to me, and occasionally popping up when fun actual knowledge about things that do hold my interest arise (e.g., vision). I leave the debate to others.
As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].
Sounds feasible enough to argue. Your opponents are over there. I'll keep guard and bark from time to time.
 
If they simply left it at "in principle" instead of claiming that it's already been achieved, I doubt there would be much disagreement. Unfortunately, they haven't established that current systems have [or are even capable of generating] conscious experience.
SHRDLU.
 
PixyMisa said:
But aren't we interested in whether consciousness is... "substratum-independent", I think is the phrase? To say "a brain is a computer" doesn't tell us whether any computers, other than brains, can generate consciousness
Church-Turing thesis.

If the universe is a Turing machine (computer), [computable] information can be generalized to any medium?

As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].
Then computers can still generate human consciousness, by simulating that composition.

Sounds like brain in a vat computation :boxedin: (or maybe clone in a can).

rocketdodger said:
Then computers can still generate human consciousness, by simulating that composition.

Indeed it is impossible to prove that we are not inhabiting a simulation right now.

Not that I think we are -- I just think the fact that we can't prove otherwise speaks volumes.

Bostrom[?]'s Simulation Argument assumes "substratum-independence", so evidence against sub-ind would be evidence against SA [Bostrom's statement of it, at least].

AkuManiMani said:
As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].

I don't think there is much reason to assume that it's limited to human brains. It's just that, since we are humans and know that we have the capacity for consciousness, that's the best starting point for investigation. Once we zero in on exactly what physical process in the brain constitutes conscious experience, and the 'whys' and 'hows' of it, we can extrapolate from there.

I'm just trying to be 'philosophical' = unassuming & persnickety (see Thomas Nagel's anti-reductionist point: "What Is It Like to Be a Bat?"). Animals do seem conscious to me (though I'm not sure whether there's a minimum complexity from which "consciousness" emerges -- and we'd have to define consciousness first.)

yy2bggggs said:
I guess we'll need to define "computer" then (if it hasn't been already): a network of switches that operate on [binary] data?
A computer is a device that carries out a series of well-defined operations. This doesn't have to be binary, or even digital.

(Turing's definition?) Unsurprisingly, much better than mine.

They can directly handle analog signals, of course. (I'm differentiating between current-generation PCs and neurons; if the brain is a "computer", it's a moot point).
Neurons trigger catastrophically when the "sum" of their inputs (technically some can be, and often are, inhibitory) reaches a certain level. If by analog signal you mean to refer to the strength of inputs, and by "handle" you mean firing, then, no.

"Analog": continuous input vs discrete: "digital".

But aren't we interested in whether consciousness is... "substratum-independent", I think is the phrase?
Depends on who "we" is. I'm particularly taking an emphatic disinterest in the HPC. My interests lie with making the thread seem less annoying to me, and occasionally popping up when fun actual knowledge about things that do hold my interest arise (e.g., vision). I leave the debate to others.

In terms of consciousness, the difference between 'attentive', 'peripheral', and 'negligible' vision (not sure if these are the trade terms) seems one big clue to how consciousness works; that is, there seem to be different strengths of consciousness, where "consciousness" is our attending to things in our visual field (more generally, in our experiential field).

As above, it may turn out that computers can duplicate all the information processing a brain does, but that "consciousness", or at least what we think of consciousness, "human consciousness", is tied to the unique physical composition of the human brain [and body].
Sounds feasible enough to argue. Your opponents are over there. I'll keep guard and bark from time to time.

"Opponents"?! Can't be... I don't know enough to even know which side I'm on. :p
 
"Analog": continuous input vs discrete: "digital".
Neurons are edge-sensitive, meaning they either fire or they don't. See the all-or-none principle.
In terms of consciousness, the difference between 'attentive', 'peripheral', and 'negligible' vision (not sure if these are the trade terms) seems one big clue to how consciousness works; that is, there seem to be different strengths of consciousness, where "consciousness" is our attending to things in our visual field (more generally, in our experiential field).
Not sure what you mean by negligible. 'Attentive' and 'peripheral' aren't parallel classes--you can attend to something that is in your peripheral field. You can also stare right smack dab at something and not notice it (see inattentional blindness).

For what it's worth, attentiveness can affect percepts--see this illusion for an example. So even when you identify these categories, note that they bleed into others.

But yeah, those are the types of things that interest me.
 