
Automatons

Correct, we can create a human being.

We'd like to be able to create the software without having to have the wetware present. Do you think that this is a worthy goal?

Do you think that there is a non-material component to the human intelligence?

If I believed that human intelligence was software, then I'd believe it was entirely non-material.

To assume that human intelligence is software is to make a judgement about something which we don't yet know.

I think that to endeavour to duplicate human intelligence is a worthy goal, even if it turns out to be futile. That is not to say that it is without risks.
 
Hi

I've got a question for you guys:

Manufactured (in the original sense) machines, as long as they don't consume some essential part of themselves in operation, such as thermal batteries and pressure accumulators, can be turned back on once they've been turned off.

If we turn off a living organism, we can't turn it back on.

Do you think this is a matter of resources consumed (food and oxygen, for instance), or is it a matter of the complexity of the machine and "turned off" being too widely distributed, or some combination of both?

(Please note that I am not asking about a "soul," about which my beliefs seem to be at odds with both mainstream Christians and atheists. I'm interested in the shared concept of the nature of a "machine" here.)
 
I don't see why the fact it hasn't happened yet means it won't ever happen. It's logical to assume that processing power will continue to progress and that someday, scientists will take advantage of it by developing advanced AI.
There is no soul and no afterlife.

I know it is hard to accept this when you are young; heck, it's even difficult for some people when they are old.

But this is how it is, or rather, there is no reliable evidence that this isn't the case.

If it is any help, I was terrified when I realized this in my youth, just as you perhaps are terrified. But I'm OK about it now. I suspect you will be too, in a decade.
 
Please do. Why is a product of evolution not a machine?
Word doesn't mean what you think it does.

Machines began as things made by man, not things found in nature that had evolved. By use of analogy, usage of the word machine started to morph and bleed into the utter crap you are playing at here.

You are confusing the dance of word meanings for things that are inherently different. Machines, tools, are sometimes emulations of things (see planes for emulations of birds) observed and that people tried to replicate or mimic. Such synthesis is sometimes successful, sometimes not. (See synthetic oil and fuels. )

Your woo seems to derive from assuming that all things can be replicated. Please, carry on with your woo. But try to understand that semantic games don't change what things are, only how you bat them around on a forum.

DR
 
Hi

I've got a question for you guys:

Manufactured (in the original sense) machines, as long as they don't consume some essential part of themselves in operation, such as thermal batteries and pressure accumulators, can be turned back on once they've been turned off.

If we turn off a living organism, we can't turn it back on.

Do you think this is a matter of resources consumed (food and oxygen, for instance), or is it a matter of the complexity of the machine and "turned off" being too widely distributed, or some combination of both?

Complexity of the processes, and the fragility of the materials that support them, would seem to be the key.

The thing that differentiates the animate from the inanimate is metabolism. For the sake of discussion, I mean the ability to draw on environmental resources and transform them into usable energy. That which we call "alive" exhibits this ability to sustain itself, in a way that mere machines cannot. Evidence to date indicates that this requires a particular mix of molecules in a particular structure, sustained by the energy the organism acquires from its environment.

OTOH, as you observe, manufactured machines depend on external energy only for their ability to function, not for their ability to exist, per se; when the energy source is removed, function ceases, yet, provided the material of the machine has not been corrupted, the machine will function again when the energy is re-applied.

I realize that I am having difficulty expressing this distinction clearly and succinctly, and would appreciate any help towards elaborating on this.

As this applies to the OP, there is no reason to assume that an eternal soul is a prerequisite for the ability to experience value--the sense that some things or situations are "better" than others. Nor is there reason to assume that the lack of a Creator implies the lack of an eternal soul. Yet if one accepts neither soul nor God, that still doesn't preclude choice--particularly considering the number of examples of non-linear thinking and the resultant behavior among humans ;)

You can choose a ready guide
In some celestial voice
If you choose not to decide
You still have made a choice

You can choose from phantom fears
And kindness that can kill
I will choose a path that's clear
I will choose free will


©1980 Peart, Lee, Lifeson

(Please note that I am not asking about a "soul," about which my beliefs seem to be at odds with both mainstream Christians and atheists. I'm interested in the shared concept of the nature of a "machine" here.)

I hope the tenor of this thread will serve as an example of how such subtle issues can be approached, and discussed, in the spirit of sharing insights "in a friendly and lively way" ;)
 
... machines, as long as they don't consume some essential part of themselves in operation, such as thermal batteries and pressure accumulators, <insert>and/or some essential part of themselves isn't corrupted during 'dormancy',</insert> can be turned back on once they've been turned off.
FTFY :)

Try starting a car with a rusted head

Try booting up a computer with a 'cooked' CPU

Sure, you can fit new parts...

I've had this axe for 42 years...
it's had 17 new handles...
and nine new heads...
Best damn axe in teh whirled
 
Hi

FTFY :)

Try starting a car with a rusted head

Try booting up a computer with a 'cooked' CPU

Sure, you can fit new parts...

I've had this axe for 42 years...
it's had 17 new handles...
and nine new heads...
Best damn axe in teh whirled


Woops. Thanks for the correction.

I hadn't thought of long-term being-turned-off-itude.

So: Do you think that living organism type machines (if machines they are) degrade so quickly after being turned off that they're difficult to restart?
 
Really? Do you have any evidence to back this up? Assuming the current model, was the singularity a state of non-existence?

Sure, why not? When the power goes out, the default setting of the computer is off.
 
So: Do you think that living organism type machines (if machines they are) degrade so quickly after being turned off that they're difficult to restart?

That's it, in a nutshell. A "living organism" relies on its own metabolism to maintain its functional state.
 
...why bother to make things better if the world doesn't matter? Who cares what difference you make? If there's no reason or purpose, then this is simply a sphere of matter orbiting around a star in an observable physical universe that goes on existing, pointlessly. Those "real people" are just the latest generation of mathematically evolved cell structures that will reproduce more structures before dying off. Their appearance of being alive is an illusion--their brains are cell-based automatons, no more alive or of value than a rock tumbling down a hill.

...The ultimate, objective conclusion on the physicality of the universe would be that morals, emotions, and social norms are illusions of the brain, evolved out of necessity for survival. That means there's no difference between a philanthropist donating to charity or a homicidal maniac killing ten people. To the universe, it doesn't matter.

Who cares what impact we leave behind? There's no eternal god to care, and future generations are robots who will live through their generation and die as we will, so their thoughts--nothing more than biochemical signals passing through some cells--have no meaning anyway.

Hi Toady, I've quoted you in full above because I think you sum it up very well. But the operative phrase is "to the universe, it doesn't matter."

But you are not the universe, you are a creature of the universe. For very good and practical reasons you have evolved to experience all those things from pain to love to altruism to hate. You are also embedded in a community of beings much like you. Your personal 'enjoyment' - a concept utterly meaningless to the universe, but very meaningful to you - will depend very much on whether you engage in philanthropy or become a homicidal maniac. If you are searching for ultimate 'meaning' on the cosmic scale, then you'll find there isn't any; but you will find plenty of meaning in the day-to-day business of being a thinking primate. And when you die you won't lament that dismal event any more than you complained about not being born yet.

And what's wrong with that? Nothing at all, welcome to reality. Cool.

Am I coming off as overly analytical here? :) I don't mean to. This is just a realization that bothers and fascinates me, though I know it's nothing new.

Not at all. I only wish more people were so.
 
If I believed that human intelligence was software, then I'd believe it was entirely non-material.

To assume that human intelligence is software is to make a judgement about something which we don't yet know.

We do know our thought processes are as dependent on the physical structure of our brains as they are on electrical/chemical signals. Any simulation of the human mind would have to model the brain as well, but there's no reason that can't be done.


If you can't override the prediction, do you have free will then? (At least, not absolutely free will.)

If you have no knowledge of this prediction, how does someone else predicting what you are going to do in any way affect your behavior? Being predictable doesn't mean you don't have free will.

If you are told the prediction, then the prediction would have to predict your reaction to the prediction as well.

For example, if it was predicted that you were going to have toast for breakfast, you might respond: "Damn right I am, I like toast". On the other hand, just to be contrary, you might respond: "Screw that, I'm having cornflakes instead!"

The end result would be ridiculous recursive predictions like "I predict that if I had predicted you would have had toast for breakfast, you'd have had cornflakes instead." or "I predict that if I had predicted you'd have had cornflakes if I had predicted you'd have had toast, then you'd have had eggs instead."
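Just for fun, the contrarian case can be sketched in a few lines of Python. The menu and the "always defy the prediction" rule are my own toy assumptions, not anything rigorous; the point is that no announced prediction survives being announced:

```python
# Toy sketch of the contrarian breakfast problem: an agent that always
# defies whatever prediction it is told. Menu is made up for illustration.

MENU = ["toast", "cornflakes", "eggs"]

def contrarian(prediction):
    """Pick anything except what was predicted."""
    options = [m for m in MENU if m != prediction]
    return options[0]

# Search for a prediction the agent would actually fulfil.
fixed_points = [p for p in MENU if contrarian(p) == p]
print(fixed_points)  # [] -- no announced prediction can be correct
```

However you enumerate the menu, the search comes up empty: a prediction announced to a committed contrarian can never be self-fulfilling, which is exactly why the toast predictions above recurse without ever settling.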

The question of whether or not a hypothetical omniscient prescient entity deliberately manipulating me into eating cornflakes for breakfast by slyly predicting I was going to have toast is a violation of my free will is something I'm content to leave for others to argue about for the time being.


So: Do you think that living organism type machines (if machines they are) degrade so quickly after being turned off that they're difficult to restart?

As far as I'm aware, living organisms don't come with an off switch, they just keep on running until they break down. Depriving a cell of oxygen is a bit like driving a car full speed down the freeway with the radiator disconnected.
Once the engine blows up, you don't expect it to start again. :)


H.A.L.

Heuristically programmed ALgorithmic Computer.

...and Heuristics constitute a "fairly simple algorithm," where, exactly?


Wikipedia said:
Heuristics are intended to gain computational performance or conceptual simplicity, potentially at the cost of accuracy or precision.

Heuristic programming is about simplicity (or computational speed). Although I admit I should have said "relatively simple" instead of "fairly simple".
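To illustrate what "relatively simple" buys you, here's a toy Python comparison (the cities and coordinates are made up for the example): the nearest-neighbour heuristic for the travelling salesman problem is a cheap greedy rule, while the exact answer requires a factorial-time search. The heuristic is fast but can return a longer tour:

```python
# Heuristic vs exact: nearest-neighbour TSP against brute-force search.
# City coordinates are hypothetical, chosen just for this example.
from itertools import permutations

cities = {"A": (0, 0), "B": (1, 5), "C": (5, 1), "D": (6, 6)}

def dist(a, b):
    (x1, y1), (x2, y2) = cities[a], cities[b]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

def tour_length(order):
    # Total length of the closed tour visiting cities in the given order.
    return sum(dist(order[i], order[(i + 1) % len(order)])
               for i in range(len(order)))

def nearest_neighbour(start="A"):
    # Greedy heuristic: always visit the closest unvisited city next.
    tour, left = [start], set(cities) - {start}
    while left:
        nxt = min(left, key=lambda c: dist(tour[-1], c))
        tour.append(nxt)
        left.remove(nxt)
    return tour

best = min(permutations(cities), key=tour_length)   # exact, O(n!)
greedy = nearest_neighbour()                        # heuristic, O(n^2)
print(tour_length(greedy) >= tour_length(best) - 1e-9)  # True: never beats the optimum
```

That trade-off, a simple fast rule that is merely good enough rather than provably best, is all I meant by calling heuristics "relatively simple".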


Your woo seems to derive from assuming that all things can be replicated. Please, carry on with your woo. But try to understand that semantic games don't change what things are, only how you bat them around on a forum.

Biologists have already manually assembled simple life forms such as viruses from basic molecules, and have simulated them on computers down to the quantum level. Why do you believe that this process cannot be scaled up (in theory if not practice) to accommodate larger organisms, such as humans?

(If it ever does become possible to replicate an intact individual human mind complete with memories in a computer simulation, I doubt it will happen in my lifetime.)
 
That Blue-Brain idea strikes me as so immoral. A model of a brain will behave just like a real brain -- sentient and all.

You'd be creating a sentient being and then simulating the effects of neurological diseases on it and stuff (one of the reasons they want to create this thing). So basically creating a sentient being and using it for research. So ethical.
 
Word doesn't mean what you think it does.

Duh. That's why I asked him the question instead of simply vomiting a self-righteous flame into the forum.

Machines began as things made by man, not things found in nature that had evolved. By use of analogy, usage of the word machine started to morph and bleed into the utter crap you are playing at here.

So you are claiming that the meaning of a word is not allowed to change over time? And you are accusing me of "utter crap?"

You are confusing the dance of word meanings for things that are inherently different.

No, I am not. I am not confusing anything, because unlike you, who obviously knows everything, and obviously knows that you know everything, I don't know everything.

Hence, in a discussion, I like to discuss -- by "asking" people what they mean (as opposed to simply telling them what they mean, and that their meaning is wrong). I put "asking" in double quotes because it is clearly a process you are unfamiliar with.

Your woo seems to derive from assuming that all things can be replicated. Please, carry on with your woo.

Hmmm... does your woo derive from the contrary assumption?

But try to understand that semantic games don't change what things are, only how you bat them around on a forum.

Well... I thought I did... but clearly, since the great darth rotor has flamed me regarding this issue, I must have been wrong.

Or... I could just ignore your post, not only because it is simply wrong but also because it is teeming with unsolicited hostility. Which reminds me... aren't most of your responses like that, Darth?
 
That Blue-Brain idea strikes me as so immoral. A model of a brain will behave just like a real brain -- sentient and all.

You'd be creating a sentient being and then simulating the effects of neurological diseases on it and stuff (one of the reasons they want to create this thing). So basically creating a sentient being and using it for research. So ethical.

Their stated goal is to "reverse-engineer the mammalian brain". As they're currently simulating part of a rat brain, I don't see any ethical problem with it. (Yet.)

If they start working on simulating an entire human (or primate) brain, then it's time for the ethics committee to step in and set a few guidelines.
 
No particular reason. That's just the unit I chose off the top of my head. Substitute with whatever microscopic unit you see fit to use.
The term "cell-based automaton" sounds like a straw man version of materialism.

I reject the notion of a soul (due to lack of evidence), but I do not think that my mind is equivalent to a collection of any microscopic units of matter.

Again, the atoms are organized mostly into molecules which are mostly organized into cells, which are organized into tissues which are organized into other structures, which are organized into organs. At every level of organization there are emergent properties.

Calling a human a "cell-based automaton" is no more valid than calling him an atom-based automaton. The stuff that we call mind or consciousness emerges only at higher levels of organization (higher than the cell, certainly).

So the argument that says humans either have souls or are cell-based automata is fallacious.

I say we don't have souls, AND we're not cell-based automata.
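The point about emergent properties has a classic toy illustration: in Conway's Game of Life, every cell obeys the same trivial local rule, yet "oscillation" is a property of the arrangement of cells, not of any single cell. A minimal Python sketch:

```python
# Conway's Game of Life on an unbounded grid, represented as a set of
# live (x, y) cells. Each cell follows the same local rule; patterns
# like the "blinker" oscillate -- a property no individual cell has.
from collections import Counter

def step(live):
    """Advance one generation."""
    # Count how many live neighbours each candidate cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, 1), (1, 1), (2, 1)}      # horizontal bar of three cells
print(step(step(blinker)) == blinker)   # True: period-2 oscillation
```

No cell "knows" it is part of an oscillator; the period-2 behaviour exists only at the level of the pattern, which is the same sense in which mind might be a property of the organization rather than of the cells.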
 
