dlorde
Philosopher
- Joined: Apr 20, 2007
- Messages: 6,864
Any examples of computation that doesn't depend on life?
Ask RocketDodger, it was his use of 'computation'. I was just querying the logic of the subsequent argument.
So life must be conscious because it is inherently purposeful and consciousness is the source of purpose.
OK; I think that speaks for itself...
I still don't see how computation can be considered to take place in the absence of a conscious mind ... snip ...
Hmm. I think that the concept that life is inherently purposeful is not soundly supported. There's no real difference between the purpose of an amoeba and the purpose of a volcano, looked at objectively.
There's certainly no purpose outside of living, conscious things.
Forgot to add:
Logical equivalence works both ways. Thus,
(A IFF B) --> (B IFF A)
http://whatis.techtarget.com/definition/0,,sid9_gci833433,00.html
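That symmetry is easy to check exhaustively; a quick sketch in Python (the `iff` helper is just for illustration, not any standard library function):

```python
from itertools import product

def iff(p, q):
    # Material biconditional: true exactly when p and q agree.
    return p == q

# (A IFF B) and (B IFF A) agree on every truth assignment.
for a, b in product([True, False], repeat=2):
    assert iff(a, b) == iff(b, a)
```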
Frank Newgent said:
Thanks for taking the time to do that.
What I have bolded in your quote describes what somebody might learn over time through practice in dealing intelligently with a large number of people.
It's not just self-referentiality; there's quite a lot of coreferentiality involved, I think it's safe to say.
How does an algorithm pack in what seems like the potentially infinite amount of info needed to make the formalization you laid out?
Thanks in advance for dealing with my questions.
Well, it obviously needs to be very complex -- but that isn't necessarily as big a deal as you might think, because there is a distinction between the complexity of an instance of an algorithm and the complexity of the "original" or "base" or "class" algorithm -- whatever you want to call it.
For example, your DNA contains the algorithm(s) needed to describe you exactly as you are -- kind of. But they are only the "template" part of the algorithm, the part people talk about when they say "this algorithm was used to arrive at this state" or something like that. What is omitted in such a statement, but what is implied, is all the data that is included in a particular "run" or "instance" of the algorithm. In the case of you, the "template" of your DNA has produced an instance that has been "running" on data -- the environment of you and your cells -- for a long time.
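A toy sketch of that distinction in Python (the class names here are invented, purely for illustration): the `Genome` "template" stays tiny and fixed, while each `Organism` instance accumulates state from whatever environment it runs in.

```python
class Genome:
    """The 'template': a small, fixed rule set (loosely analogous to DNA)."""
    def __init__(self, rule):
        self.rule = rule  # the only thing the template stores

class Organism:
    """An 'instance': the template plus everything a particular run accumulates."""
    def __init__(self, genome):
        self.genome = genome
        self.state = []  # grows with every environmental input

    def live(self, stimulus):
        # The template supplies the rule; the environment supplies the data.
        self.state.append(self.genome.rule(stimulus))

genome = Genome(rule=lambda x: x * 2)       # tiny, shared template
twin_a, twin_b = Organism(genome), Organism(genome)
twin_a.live(3)                              # same template...
twin_b.live(5)                              # ...different instance state
```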
Obviously, the latter is much more complex than your DNA. It would be impossible to store very much about your current state in your DNA -- there is just too much complexity in you as you are now. People think DNA has tons of storage but they are wrong -- it had tons of storage in 1970 when people's segmented core drives had only kilobytes of memory, but now, when you can get terabyte drives for under $50, DNA isn't the masterpiece it used to be. The trick is that evolution led to DNA that relies upon the laws of nature to do most of its heavy lifting.
For instance, does DNA instruct most biochemicals on how to react? No, of course not. There is no need to -- as long as DNA instructs the cells to make the chemicals, and maybe get them in proximity, basic chemistry takes over and the reactions take place. DNA doesn't even instruct enzymes on how to catalyze -- it just instructs the cells how to make the enzymes and then nature takes over. If you wanted to embed *every* step of such an algorithm in the instructions of DNA it would require orders of magnitude more complexity than is available.
And does DNA instruct your brain on how to act given a certain situation? Does DNA instruct you on how to drive a car, or speak English, or even make fundamental inferences about causation (the most basic task of any brain) ? No, of course not. All DNA has done is given your neurons a very primitive topographical arrangement -- that of a baby. The laws of nature, combined with the environment you grew up in, leads to everything else.
So, long story short, the same kind of tricks can be used to specify a formal algorithm that has emergent behavior vastly more complex than the formalization itself. Yes, the formalization you are asking for would be very complex, but it might not be as complex as you envision, because all that is required is the formalization of steps that are not implicit given the laws of nature. All the other stuff -- the steps that are implicit, that combine with the basic algorithm to generate an instance that has emergent properties and behavior -- can be omitted, and there are very many such steps.
That means you can describe an infinite set of behavior with a finite algorithm. Then what ends up being infinite are the various instances of the algorithm that occur.
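That point about a finite template with unboundedly many instances can be made concrete; the few lines below never change, yet every distinct input history yields a distinct run (the names are illustrative only):

```python
def run_instance(step, inputs):
    """Run a fixed, finite 'template' over an arbitrary input stream.

    The template is a handful of lines; the set of possible runs is
    unbounded, because every input history produces its own state.
    """
    state = 0
    for x in inputs:
        state = step(state, x)
    return state

step = lambda s, x: s + x   # the entire 'algorithm' is one line
print(run_instance(step, [1, 2, 3]))    # one instance
print(run_instance(step, range(100)))   # a very different instance
```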
Frank Newgent said:
rocketdodger said:
How one might formalize normative statements such as your quote?
Any statement "X should Y" carries an implicit " .... if X wants Z" with it.
Whether someone says it or not, it is there -- even if only in the head of the person who made the statement.
So for me to say "you should be very suspicious of people who are so obsessed with objective 'good'" is really saying " you should be very suspicious of people who are so obsessed with objective 'good' if you care about people disclosing their true intentions with you, and I assume you do," etc.
Already, we are well on the road to complete formalization. In the expanded context, such as that above, "you should" clearly means something like "the behavior with the highest probability of resulting in reaching your goal, all else being equal, in my opinion, is to ..."
So now we have :
"the behavior with the highest probability of resulting in reaching your goal, all else being equal, in my opinion, is to be very suspicious of people who are so obsessed with objective 'good' if you care about people disclosing their true intentions with you, and I assume you do"
Just to get you further along, I will formalize some of the other terms in there.
"in my opinion" can be formalized to "according to the conclusions reached by logical inference I have performed, possibly subconsciously, and/or insight I have had, the source of which is still not agreed upon by human philosophers/scientists .. ... because such people are often motivated by motives that are not immediately apparent..."
"if you care about" can be formalized to "if it would benefit you somehow, even in a purely psychological manner, for .. to ..."
So now we have:
"the behavior with the highest probability of resulting in reaching your goal, all else being equal, according to the conclusions reached by logical inference I have performed, possibly subconsciously, and/or insight I have had, the source of which is still not agreed upon by human philosophers/scientists, is to be very suspicious of people who are so obsessed with objective 'good' if it would benefit you somehow, even in a purely psychological manner, for people to disclose their true intentions with you, and I assume you do, because such people are often motivated by motives that are not immediately apparent"
Should I go on?
I never said it would be short and sweet. But it can be done.
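As a toy illustration of how such an expansion could be mechanized (the phrase table below is invented for this example, not any standard formalism), each informal idiom can be rewritten into its expanded reading:

```python
# Hypothetical phrase table following the expansion steps above.
EXPANSIONS = {
    "you should": ("the behavior with the highest probability of resulting "
                   "in reaching your goal, all else being equal, "
                   "in my opinion, is to"),
    "in my opinion": ("according to the conclusions reached by logical "
                      "inference I have performed, possibly subconsciously"),
    "if you care about": ("if it would benefit you somehow, even in a "
                          "purely psychological manner, for"),
}

def formalize(statement):
    # Apply each rewrite in order; later rules may expand text
    # introduced by earlier ones, mirroring the step-by-step expansion.
    for informal, formal in EXPANSIONS.items():
        statement = statement.replace(informal, formal)
    return statement
```

For example, `formalize("you should be wary")` unpacks both the "should" and the nested "in my opinion" in a single pass.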
Thanks for taking the time to do that.
What I have bolded in your quote describes what somebody might learn over time through practice in dealing intelligently with a large number of people.
It's not just self-referentiality; there's quite a lot of coreferentiality involved, I think it's safe to say.
How does an algorithm pack in what seems like the potentially infinite amount of info needed to make the formalization you laid out?
Thanks in advance for dealing with my questions.
Malerin said: Pixy, logical equivalence is expressed by "if and only if"
No.
Malerin said: I don't see why you're so reluctant to have your claim reduced to logical terms.
Reluctant? All I ask is that you get it right.
Malerin said: Pixy, is there any possible world where consciousness occurs without SRIP occurring (or vice versa)? If not, then consciousness occurs IF AND ONLY IF SRIP occurs.
No, this is precisely where you are wrong, and have been wrong the entire time. We do not observe self-referential information processing and deduce the presence of consciousness. To observe self-referential information processing is to observe the presence of consciousness.
Wikipedia said: Logical equivalence is different from material equivalence. The material equivalence of p and q (often written p↔q) is itself another statement in the same object language as p and q, which expresses the idea "p if and only if q". In particular, the truth value of p↔q can change from one model to another. The claim that two formulas are logically equivalent is a statement in the metalanguage, expressing a relationship between two statements p and q. The claim that p and q are semantically equivalent does not depend on any particular model; it says that in every possible model, p will have the same truth value as q. The claim that p and q are syntactically equivalent does not depend on models at all; it states that there is a deduction of q from p and a deduction of p from q.
Malerin, all you are doing at this point is quote-mining for less precise definitions. Logical equivalence and material equivalence are not the same thing.
Malerin said: I don't know why you keep bringing up material equivalence.
I didn't bring it up. You did.
Malerin said: You admit your claim is a logical equivalence.
Right. Which is different from material equivalence, which is what you are asserting.
Malerin said: You keep bringing up the same Wikipedia link (which proves my point anyway).
No. It shows that you are wrong and explains why.
Malerin said: SRIP is logically equivalent to consciousness.
Right.
Malerin said: SRIP is a necessary and sufficient condition for consciousness
Wrong. It is not a condition. It is the same thing.
Malerin said: I've supported this with multiple sources.
Yes. You went looking for less precise definitions. However, you also quoted from a Wikipedia article which explicitly disagreed with you.
Malerin said: You're either being deliberately dishonest or afraid of the logical implications of your claim.
Again with the appeal to motive. That's a logical fallacy, Malerin; something you seem all too familiar with.
Wikipedia said: Logical equivalence is different from material equivalence. The material equivalence of p and q (often written p↔q) is itself another statement in the same object language as p and q, which expresses the idea "p if and only if q". In particular, the truth value of p↔q can change from one model to another.
I am making a statement of logical equivalence. You are insisting that this is a statement of material equivalence. It's not.
Malerin said: Either way, I've wasted enough time with you.
Oh, certainly. Assuming a false premise and cherry-picking quotes in a futile effort to support it was a complete waste of your time. I suggest you don't do that in the future.
This insight stems from decades of work in neuroscience and computer science.
When do you expect to publish?
We agree that subjective experiences are real, and we provide a mechanism for them.
No. You have perhaps provided a mechanism for SRIP. But that's where any "mechanism" or explanation stops.
What leap of faith is that, and why do you believe it is required?
You have basically two things.
We build self-referential information processing systems right now, and they have subjective experiences.
Some things I presume you claim to be having subjective experiences of their own: modern washing machines (already established), the internet, a correctly functioning flush toilet, two people talking to each other about what they plan to do together (as a pair, so perhaps there are three conscious entities involved, or do their individual SRIP machines just merge and become one when that happens?), a camera with an automatic focus system pointed at its own image in a mirror, and presumably any running quine. Brainf**k version below (by Brian Raiter):
>>+++++++>>++>>++++>>+++++++>>+>>++++>>+>>+++>>+>>+++++>>+>>++>>+>>++++++>>++>>++++>>+++++++>>+>>+++++>>++>>+>>+>>++++>>+++++++>>+>>+++++>>+>>+>>+>>++++>>+++++++>>+>>+++++>>++++++++++++++>>+>>+>>++++>>+++++++>>+>>+++++>>++>>+>>+>>++++>>+++++++>>+>>+++++>>+++++++++++++++++++++++++++++>>+>>+>>++++>>+++++++>>+>>+++++>>++>>+>>+>>+++++>>+>>++++++>>+>>++>>+>>++++++>>+>>++>>+>>++++++>>+>>++>>+>>++++++>>+>>++>>+>>++++++>>+>>++>>+>>++++++>>+>>++>>+>>++++++>>++>>++++>>+++++++>>+>>+++++>>+++++++>>+>>+++++>>+>>+>>+>>++++>>+>>++>>+>>++++++>>+>>+++++>>+++++++>>+>>++++>>+>>+>>++>>+++++>>+>>+++>>+>>++++>>+>>++>>+>>++++++>>+>>+++++>>+++++++++++++++++++>>++>>++>>+++>>++>>+>>++>>++++>>+++++++>>++>>+++++>>++++++++++>>+>>++>>++++>>+>>++>>+>>++++++>>++++++>>+>>+>>+++++>>+>>++++++>>++>>+++++>>+++++++>>++>>++++>>+>>++++++[<<]>>[>++++++[-<<++++++++++>>]<<++..------------------->[-<.>>+<]>[-<+>]>]<<[-[-[-[-[-[-[>++>]<+++++++++++++++++++++++++++++>]<++>]<++++++++++++++>]<+>]<++>]<<[->.<]<<]
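For comparison, the same self-reproduction trick is far shorter in a higher-level language; this Python one-liner prints exactly its own source:

```python
s = 's = %r; print(s %% s)'; print(s % s)
```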
This is not a hard concept:
A bachelor is logically equivalent to an unmarried man.
There is no possible world where a married bachelor exists.
A man is a bachelor IF AND ONLY IF he's an unmarried man (and vice versa).
Being an unmarried man is a necessary and sufficient condition for being a bachelor.
Hmm. I think that the concept that life is inherently purposeful is not soundly supported. There's no real difference between the purpose of an amoeba and the purpose of a volcano, looked at objectively.
There's certainly no purpose outside of living, conscious things.
I'd go out on a limb and say that the behavior of an amoeba is inherently purpose driven in some sense...
When do you expect to publish?
The work has been published.
No. You have perhaps provided a mechanism for SRIP. But that's where any "mechanism" or explanation stops.
Completely incorrect.
You have basically two things.
1. Firstly the brain, extremely complicated, using neurons and associated structures to perform actions that appear to be (at least in some cases) a form of computation similar (but certainly not precisely identical) to computation as performed by some artificial neural networks (originally created to loosely model the computational aspects of the brain).
It's an information processor.
2. Secondly, we each have our subjective experiences. The feeling of pain, being aware of (at least some) of our thoughts, etc. Some might just say qualia to avoid having to spell it out over and over again with a lot of other words that are generally far more ambiguous.
It's self-referential.
Leap of faith: Consciousness is SRIP.
It's not a leap of faith, it's understanding the question.
Not even that (1) might cause (2), or vice versa, by some as yet not understood physical process, but in fact you jump straight to stating they are one and the same thing. SRIP appears to almost be a certainty in the human brain, but for what other reason should we have any particular reason to believe that it is one and the same thing as subjective experience? Or logically equivalent, if that's something different?
As I keep saying to Malerin, "logically equivalent" means "one and the same thing". So both those are correct.
Some things I presume you claim to be having subjective experiences of their own: modern washing machines (already established)
Yes, the more complex ones perform self-referential information processing.
the internet
Why and how?
a correctly functioning flush toilet
Why and how?
two people talking to each other about what they plan to do together (as a pair, so perhaps there are three conscious entities involved, or do their individual SRIP machines just merge and become one when that happens?)
Why and how?
a camera with an automatic focus system pointed at its own image in a mirror
Why and how?
and presumably any running quine.
Now that one is actually interesting. Yes, plausibly.
Of course this quine is also completely deterministic, and requires no external input. But clearly self-referential information processing, right? How about a BF self-interpreter running a copy of itself running this same quine?
My first response is, yes. That is conscious.
On the other hand, I assume that you agree that a flush toilet with a broken float valve is completely unconscious. (Cistern fills up and then water goes down the overflow pipe until someone flushes, rather than the cistern filling up and the valve closing due to a correctly functioning float.)
Dennett goes into that in some detail, explaining why thermostats - which are equivalent to a flush toilet - are not conscious: they are systems too limited in their representational ability to be considered in that category.
Is any of this wrong?
A whole bunch of it, yes, as far as I can tell. Though the quine example (a program that produces its own source code as output) is a very interesting one, and of course one that Hofstadter used in his books.
Do you think there is or should be any difference between the subjective experience of being dead (as you were before you were conceived) and in an extremely deep and sound sleep?
You can wake up based on sensory input. There is still something going on there.
The only thing I've done here is cut away the waffle. Consciousness is information processing, and it's self-referential. There you go. We're done. All of this is established fact.
If you want to add something to that, go ahead. What do you want to add, and why?
Reading what I've bolded, I hope you understand that you have not answered the question at all.
You say "all that is required is the formalization of steps that are not implicit given the laws of nature."
OK fine. I'll refer back to your original quote now:
Again, "all that is required is the formalization of steps that are not implicit given the laws of nature".
OK, let's leave out what is implicit given the laws of nature.
There, it's done.
Now, leaving out what is implicit given the laws of nature, how will your algorithm pack in what seems like the potentially infinite amount of info needed to make the formalization you laid out?
Your formalization, again, being: "the behavior with the highest probability of resulting in reaching your goal, all else being equal, according to the conclusions reached by logical inference I have performed, possibly subconsciously, and/or insight I have had, the source of which is still not agreed upon by human philosophers/scientists, is to be very suspicious of people who are so obsessed with objective 'good' if it would benefit you somehow, even in a purely psychological manner, for people to disclose their true intentions with you, and I assume you do, because such people are often motivated by motives that are not immediately apparent".
I don't mean to go in circles here, but it seems that can't be helped. You either cannot address the question I pose, or you are doing what you can to sidestep it.