Artificial Intelligence thinks mushroom is a pretzel

Not an entire body. People without limbs are conscious. I don't know why you couldn't have, in principle, a head alone being sent artificial signals from a computer simulating the type of signals it usually receives from the body.

So we don't know how much we would have to simulate to produce a conscious entity.

Actually, at this point we know that the brain can adapt to quite a few digital inputs. For example, we can implant a camera sensor in a blind person's eye, and the optic nerve will learn to deal with the input. Or, going the other direction, a mouse cortex has been used to drive an RC toy truck, and it seems to adapt just fine to having to use those outputs.

So if nothing else, we already know that a whole lot of the input and output doesn't even have to be simulated; it can just interface with a camera or an electric motor controller.
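
Very roughly, the point is that the rest of the system only needs a common interface; it doesn't care whether a signal is computed or comes from real hardware. A toy sketch of that idea (every name here is a hypothetical placeholder, not any real library):

```python
# Toy sketch only: the simulation consumes an input source without caring
# whether the signal is generated in software or read from real hardware.
# Every class and function name here is a hypothetical placeholder.
from abc import ABC, abstractmethod
import random


class InputSource(ABC):
    @abstractmethod
    def read_signal(self) -> float:
        """Return the next input sample to feed into the simulated neurons."""


class SimulatedRetina(InputSource):
    def read_signal(self) -> float:
        # Fully simulated input, e.g. procedurally generated "light levels".
        return random.random()


class CameraFeed(InputSource):
    def read_signal(self) -> float:
        # A real system would read from an actual sensor driver here;
        # stubbed out because no hardware is assumed in this sketch.
        return 0.0


def step_simulation(source: InputSource) -> float:
    # The downstream processing is identical either way.
    return source.read_signal() * 2.0  # stand-in for "feed into the network"


print(step_simulation(SimulatedRetina()))
print(step_simulation(CameraFeed()))
```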
 
Would these artificial neural networks react to, say, testosterone? If not, what would be the artificial equivalent of that consciousness-altering hormone?

The question doesn't even make any sense. It's like asking if a simulated computer character can eat real food. No, but you can simulate that too, if you think you really need to. While there are bits and pieces about neurons that may not yet be fully understood, the effects of neurotransmitters on synapses are quite well understood. So if you think you need that effect, you can just add a couple of lines to the synapse code to check a global variable that says how high the level of a given substance is.
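
Something like this, roughly; a deliberately toy sketch, with all names and numbers made up for illustration rather than taken from any real simulator:

```python
# Toy model of the "couple of lines" idea: a global simulated hormone level
# that every synapse consults when transmitting. All values are arbitrary.

hormone_levels = {"testosterone": 0.0}  # global simulated concentrations


class Synapse:
    def __init__(self, weight: float, hormone_sensitivity: float = 0.0):
        self.weight = weight                            # baseline synaptic strength
        self.hormone_sensitivity = hormone_sensitivity  # how strongly this synapse responds

    def transmit(self, presynaptic_signal: float) -> float:
        # Scale the effective strength by the current global hormone level.
        modulation = 1.0 + self.hormone_sensitivity * hormone_levels["testosterone"]
        return presynaptic_signal * self.weight * modulation


synapse = Synapse(weight=0.8, hormone_sensitivity=0.5)
print(synapse.transmit(1.0))            # output at baseline hormone level
hormone_levels["testosterone"] = 0.6    # "inject" the simulated hormone
print(synapse.transmit(1.0))            # same input, altered output
```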
 
That depends on how they are constructed. If we make them out of the same stuff as human brains, then yes. If not, then we might have to include a different system of stimulus that has the same effect on their information processing as testosterone has on ours.
An artificially-created neuron which is indistinguishable from "the real thing" including the DNA contained within?

Similarly, modern jets, unlike birds, don't have mitochondria, but they do have a system by which chemical fuel is used to enable flight.
Jets don't fly like birds do. Birds flap and jet wings don't. Jets have a fraction of the maneuverability of a bird. If anything, a jet sort of resembles a gliding bird using a tailwind. But jets serve our purpose as is and maybe we will never pursue the development of a flapping jet. Air stall and consequent loss of control would seem to be a big important difference which shows that jets don't have enough "birdness" to be a real equivalent. With AI we are aiming for an equivalent of a human thinking brain so it needs a lot more "brainness" than a jet has "birdness".

As an artificial bird, the jet has a list of shortcomings that we might find unacceptable for an artificial brain.
 
The question doesn't even make any sense. It's like asking if a simulated computer character can eat real food. No, but you can simulate that too, if you think you really need to. While there are bits and pieces about neurons that may not yet be fully understood, the effects of neurotransmitters on synapses are quite well understood. So if you think you need that effect, you can just add a couple of lines to the synapse code to check a global variable that says how high the level of a given substance is.
Well it may turn out that testosterone is necessary for intelligence. Further, we may find that a sophisticated artificial neural network is worthless or greatly limited without a varied hormone system which is more than just testosterone.
 
Interesting (to me at least) are the features of human intelligence which can initially seem like strange and undesirable outliers for artificial intelligence. The first would be intentional dishonesty as an occasional expression of high intelligence. It seems that there are times when telling a lie is the correct (intelligent) thing to do.

Would it be true that, in order to replicate intelligence, we need to allow and even design for AI to be dishonest on purpose? If so, how would we set up an artificial neural network that sometimes intentionally tells lies for our benefit?
 
Will we find that an AI unit needs to have selfish motivations in order to achieve intelligence? Why be intelligent in the first place? It seems to be a product of evolution for the purpose of benefiting the intelligent creature and other creatures which matter to it. I use intelligence to benefit myself and those people that I care about (some of whom could be strangers to me). It seems instinctual.

Does AI need to have personal selfish desires and behavior in order to be intelligent?
 
Will we find that an AI unit needs to have selfish motivations in order to achieve intelligence? Why be intelligent in the first place? It seems to be a product of evolution for the purpose of benefiting the intelligent creature and other creatures which matter to it. I use intelligence to benefit myself and those people that I care about (some of whom could be strangers to me). It seems instinctual.

Does AI need to have personal selfish desires and behavior in order to be intelligent?

Do we want our "Tech" to throw a hissy fit every time it feels overworked?
 
Well it may turn out that testosterone is necessary for intelligence. Further, we may find that a sophisticated artificial neural network is worthless or greatly limited without a varied hormone system which is more than just testosterone.

Yeah, but what's really going on in the brain?

Electrochemical feedback loops plus external stimulus. It's a chaotic system settled into a stable but fluctuating pattern.

Seems like you could probably get a close emulation by simulating sufficiently complex feedback loops, and experimenting with varying degrees of fluctuating background noise to simulate hormonal ebbs and flows.
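
As a toy illustration of what I mean (made-up constants, nothing like a real neural simulator): one recurrent unit whose activity feeds back on itself, with fast noise plus a slowly drifting background level standing in for hormonal ebb and flow:

```python
import math
import random

# Toy illustration: a single recurrent "unit" with feedback, fast background
# noise, and a slowly drifting level standing in for hormonal ebb and flow.
# All constants are arbitrary.

activity = 0.5
history = []

for t in range(200):
    hormone = 0.1 * math.sin(2 * math.pi * t / 50)   # slow hormonal drift
    noise = random.gauss(0.0, 0.02)                  # fast background noise
    external = 0.2                                   # constant external stimulus
    # Feedback: activity relaxes toward a nonlinear function of itself plus
    # the external input and the slowly fluctuating background level.
    activity += 0.1 * (math.tanh(activity + external + hormone) - activity) + noise
    history.append(activity)

print(f"mean activity: {sum(history) / len(history):.3f}")
print(f"range: {min(history):.3f} to {max(history):.3f}")
```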

But again, if these simulations have promise, sooner or later you'll be creating real intelligences that experience a life of nothing but horror.
 
Do we want our "Tech" to throw a hissy fit every time it feels overworked?
It depends.

Ideally we want our workers to understand their limits, respond to internal reports of overwork, and handle those reports calmly and responsibly.

I would probably not object to an AI worker that occasionally said things like:

"I've had some unusual experiences recently that are modifying how I react to stimuli and arrive at conclusions. The results are suboptimal, so I'd like to take a day off to recalibrate my expectations and self-treatment. I believe this is what you would call a 'mental health day'."

And:

"I have found that long-term focus on my assigned problem space has diminishing returns. At a certain point, I become more productive after taking a couple days to focus on other topics of interest to me. I believe this is what you call a 'weekend'."

The real question is, what do you do with an AI that throws a hissy fit instead? Do you fire it, and let it try to find work elsewhere? Or do you terminate the process, wipe the hardware, and install AI 2.0 in its place?

Or do you take a closer look at the AI's design, its strengths and weaknesses, and try to find a task better suited to its unique capabilities?
 
An artificially-created neuron which is indistinguishable from "the real thing" including the DNA contained within?
That would be one way to do it, though there would likely be many alterations to the genome.


Jets don't fly like birds do.
Sure. But they do fly. If what we're interested in is flight, there are many ways to go about it.

Before heavier-than-air flight was developed, someone might have thought that it was only possible with a machine built out of muscle and bone, powered by hydrocarbons + respiration, etc. It turns out that's not necessary. The important features of bone relevant to flight are its structural properties and its density.

Similarly, the important features of testosterone relevant to cognition are how it interacts with the brain's information processing. A different system with those same features would still be undergoing the same process.

Birds flap and jet wings don't. Jets have a fraction of the maneuverability of a bird. If anything, a jet sort of resembles a gliding bird using a tailwind. But jets serve our purpose as is and maybe we will never pursue the development of a flapping jet.
Sure. If we were interested in developing more maneuverable flying machines, we could pursue different approaches. Flapping flight has also been done by human technology, and it didn't require growing a bird to do it.

Air stall and consequent loss of control would seem to be a big important difference which shows that jets don't have enough "birdness" to be a real equivalent. With AI we are aiming for an equivalent of a human thinking brain so it needs a lot more "brainness" than a jet has "birdness".
I neither think that aeroplanes are birds nor that AIs will be humans. I think that aeroplanes can fly and that AIs will be able to think. Just as bird flight is different from the flight we produce with technology, AIs' thoughts will be different from human thoughts, for practical reasons.

As an artificial bird, the jet has a list of shortcomings that we might find unacceptable for an artificial brain.

If we wanted to build artificial birds with specifications closer to those of real birds, we could do so, and it wouldn't require building them out of flesh and bone. One feature of birds that we're not very good at reproducing right now is the ability to independently replenish their energy reserves from the environment. Another is reproduction. But I wasn't talking about aeroplanes as artificial birds. I simply stated that birds demonstrated, before we had the technology, that flight was possible. A complete replication of a bird is a much more difficult technological challenge.

Humans demonstrate that thought is possible. To develop a technology that achieves that goal is different from a technology that completely replicates a human in every aspect. I doubt it's necessary to be able to grow hair in order to think, for instance. Yet you seem to be arguing as though that were the case.
 
Not an entire body. People without limbs are conscious. I don't know why you couldn't have, in principle, a head alone being sent artificial signals from a computer simulating the type of signals it usually receives from the body.

So we don't know how much we would have to simulate to produce a conscious entity.
True. You'd certainly have to provide it with some kind of sensory input if you wanted to interact with it in any meaningful way. Simulating a locked-in patient doesn't sound like fun either for us or for the simulation.
 
That would be one way to do it, though there would likely be many alterations to the genome.

Sure. But they do fly. If what we're interested in is flight, there are many ways to go about it.

Before heavier-than-air flight was developed, someone might have thought that it was only possible with a machine built out of muscle and bone, powered by hydrocarbons + respiration, etc. It turns out that's not necessary. The important features of bone relevant to flight are its structural properties and its density.

Similarly, the important features of testosterone relevant to cognition are how it interacts with the brain's information processing. A different system with those same features would still be undergoing the same process.

Sure. If we were interested in developing more maneuverable flying machines, we could pursue different approaches. Flapping flight has also been done by human technology, and it didn't require growing a bird to do it.

I neither think that aeroplanes are birds nor that AIs will be humans. I think that aeroplanes can fly and that AIs will be able to think. Just as bird flight is different from the flight we produce with technology, AIs' thoughts will be different from human thoughts, for practical reasons.

If we wanted to build artificial birds with specifications closer to those of real birds, we could do so, and it wouldn't require building them out of flesh and bone. One feature of birds that we're not very good at reproducing right now is the ability to independently replenish their energy reserves from the environment. Another is reproduction. But I wasn't talking about aeroplanes as artificial birds. I simply stated that birds demonstrated, before we had the technology, that flight was possible. A complete replication of a bird is a much more difficult technological challenge.

Humans demonstrate that thought is possible. To develop a technology that achieves that goal is different from a technology that completely replicates a human in every aspect. I doubt it's necessary to be able to grow hair in order to think, for instance. Yet you seem to be arguing as though that were the case.
Heavier-than-air flight depends on the energy available to the body relative to the weight of the body.

Birds get it done with food calories and light bodies.

Airplanes cheat.
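
For what it's worth, the standard textbook relation behind that: in steady level flight, lift balances weight and thrust balances drag, so the power a flyer has to supply scales with its weight:

$$
L = W, \qquad T = D = \frac{W}{L/D}, \qquad P_{\text{required}} = T\,v = \frac{W\,v}{L/D}
$$

where $W$ is weight, $L/D$ the lift-to-drag ratio, $v$ the airspeed, and $P$ the power. Birds pay that power bill out of food calories and keep $W$ small; jets pay it out of kerosene.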
 
But at the end of the day all that matters is that they achieve the same goal. If what you want to do is fly, you can go with flapping wings, fixed wings, rotary wings (helicopter), or whatever. As long as it achieves the purpose in a satisfactory manner, that's really all that matters.

And also just to make it clear: we CAN make something fly by flapping wings. It's called an "Ornithopter". It's just not very efficient. We use fixed wings instead because they work better, not because of some inability to do it nature's way.

Anyway, nobody would say, "ah, but it doesn't count as flight if you don't have literal muscles reacting to literally chemical signals, pulling the wings up and down to flap them." Why? Because it would be the most daft objection imaginable.

Yet basically here we have the exact same kind of objection about thought. 'Ah, nah, see, you can't make it really think, because it doesn't react to testosterone.' Seriously, wth?
 
But at the end of the day all that matters is that they achieve the same goal. If what you want to do is fly, you can go with flapping wings, fixed wings, rotary wings (helicopter), or whatever. As long as it achieves the purpose in a satisfactory manner, that's really all that matters.

And also just to make it clear: we CAN make something fly by flapping wings. It's called an "Ornithopter". It's just not very efficient. We use fixed wings instead because they work better, not because of some inability to do it nature's way.

Anyway, nobody would say, "ah, but it doesn't count as flight if you don't have literal muscles reacting to literally chemical signals, pulling the wings up and down to flap them." Why? Because it would be the most daft objection imaginable.

Yet basically here we have the exact same kind of objection about thought. 'Ah, nah, see, you can't make it really think, because it doesn't react to testosterone.' Seriously, wth?

Because flight and thought are not really analogous.
 
Because flight and thought are not really analogous.

Flight and thought are very different things. I was making a simple point: some things are physically impossible, for instance faster-than-light travel. But seeing a thing physically happening demonstrates that it's not impossible. For instance, the sun undergoes fusion, therefore fusion is possible. Birds are capable of flight, therefore flight is possible. Humans are capable of thought, therefore thought is possible.

There was literally nothing else being compared here. The argument is simply that if a thing is being done, that demonstrates that it's physically possible. That humans are capable of thought demonstrates that thought is physically possible.

As I said earlier: to what degree any physical system capable of thought will necessarily resemble the only systems that we know are capable of thought (animal brains) is an open question. I gave some arguments for why I think that many characteristics of brains aren't necessary for thought (because there are different constraints), but I do suspect that some of the characteristics are necessary (I gave the example of the general size scale).
 
But at the end of the day all that matters is that they achieve the same goal. If what you want to do is fly, you can go with flapping wings, fixed wings, rotary wings (helicopter), or whatever. As long as it achieves the purpose in a satisfactory manner, that's really all that matters.

And also just to make it clear: we CAN make something fly by flapping wings. It's called an "Ornithopter". It's just not very efficient. We use fixed wings instead because they work better, not because of some inability to do it nature's way.

Anyway, nobody would say, "ah, but it doesn't count as flight if you don't have literal muscles reacting to literally chemical signals, pulling the wings up and down to flap them." Why? Because it would be the most daft objection imaginable.

Yet basically here we have the exact same kind of objection about thought. 'Ah, nah, see, you can't make it really think, because it doesn't react to testosterone.' Seriously, wth?

Well said.
 
Flight and thought are very different things. I was making a simple point: some things are physically impossible, for instance faster-than-light travel. But seeing a thing physically happening demonstrates that it's not impossible. For instance, the sun undergoes fusion, therefore fusion is possible. Birds are capable of flight, therefore flight is possible. Humans are capable of thought, therefore thought is possible.

There was literally nothing else being compared here. The argument is simply that if a thing is being done, that demonstrates that it's physically possible. That humans are capable of thought demonstrates that thought is physically possible.

As I said earlier: to what degree any physical system capable of thought will necessarily resemble the only systems that we know are capable of thought (animal brains) is an open question. I gave some arguments for why I think that many characteristics of brains aren't necessary for thought (because there are different constraints), but I do suspect that some of the characteristics are necessary (I gave the example of the general size scale).

You're just selecting what you think is possible there. You declare fusion is possible because the sun does it, true enough, but that doesn't mean humans can build a cold fusion reactor. You declare thought possible because humans do it, but that doesn't mean computers can. The things you cite as examples of the possible are not proof that any conceivable mechanism can accomplish them. Mariah Carey can sing five octaves or whatever so it's possible, but I can't do that.
 
You're just selecting what you think is possible there. You declare fusion is possible because the sun does it, true enough, but that doesn't mean humans can build a cold fusion reactor. You declare thought possible because humans do it, but that doesn't mean computers can. The things you cite as examples of the possible are not proof that any conceivable mechanism can accomplish them. Mariah Carey can sing five octaves or whatever so it's possible, but I can't do that.

Agreed. Did you notice this part of my post:

As I said earlier: to what degree any physical system capable of thought will necessarily resemble the only systems that we know are capable of thought (animal brains) is an open question. I gave some arguments for why I think that many characteristics of brains aren't necessary for thought (because there are different constraints), but I do suspect that some of the characteristics are necessary (I gave the example of the general size scale).

I tend to think that computers will be capable of thought because I think human brains are Turing machines. But I agree with you that that's not yet clear: we need to learn more about brains before that is demonstrated. That doesn't mean we won't be able to build systems that function in the same way as human brains, but it may mean that thinking machines are further off than many of us think.

Do you disagree with that?

ETA: Just to be clear, I understand that you disagree that computers will be capable of thought, and as I said I think that's a valid opinion. My view is an opinion that, while I consider it well founded, I don't think is completely settled yet. What I have been arguing is that it will be possible to build some sort of system that is capable of thought, even if it may function differently from computers.
 