
ChatGPT

Uh, just this: The reason I claim a neuron is more complicated than a transistor (or a flip-flop) is that it normally connects to several synapses, and it does some kind of calculation on how to react to signals.
Neurons also do not have binary connections to other synapses. The output is not the same over all connections, but we know that the more a certain connection fires, the thicker it grows, allowing more current to pass through, whereas rarely used connections tend to wither and lose contact.

At least, that is what I gathered from an article in Scientific American some years ago.
 
Again, no one else has brought intelligence into the discussion but you; you keep creating strawmen.

If a dog can be sentient in a different way to a human then it provides a counter claim to one of the assertions you use in your argument against AI being sentient.

And I never said sentience in dogs was different than in humans. Where did you get the idea I said that? In fact, I said there's evidence that biological lifeforms with a semblance of a brain would have a sentience similar to humans'. We evolved. Sentience began a long way back on the evolutionary tree. It didn't just magically occur in humans.

Talk about a straw man.
 
Neurons also do not have binary connections to other synapses. The output is not the same over all connections, but we know that the more a certain connection fires, the thicker it grows, allowing more current to pass through, whereas rarely used connections tend to wither and lose contact.

At least, that is what I gathered from an article in Scientific American some years ago.

Exactly! In fact you could argue that each neuron is a small, self-programming computer. That is what makes the brain so hugely complicated. If it were just a problem of stacking enough CPU and memory chips together, we could do that tomorrow.
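The use-it-or-lose-it synaptic behaviour described above can be sketched as a toy Hebbian update rule. This is a rough caricature, not a neuron model; all names, constants, and the update rule itself are illustrative choices:

```python
import random

# Toy Hebbian update: connections that fire together strengthen,
# unused connections decay - a caricature of the synaptic behaviour
# described above. LEARN_RATE and DECAY are arbitrary constants.
LEARN_RATE = 0.1   # growth per co-activation
DECAY = 0.02       # shrinkage per step when unused

def hebbian_step(weights, pre_active, post_active):
    """Update a list of synaptic weights for one time step."""
    new_weights = []
    for w, pre in zip(weights, pre_active):
        if pre and post_active:
            w += LEARN_RATE * (1.0 - w)   # strengthen, saturating at 1
        else:
            w = max(0.0, w - DECAY)       # wither toward losing contact
        new_weights.append(w)
    return new_weights

weights = [0.5, 0.5, 0.5]
for _ in range(20):
    # first input fires every step, second never, third at random
    weights = hebbian_step(weights, [True, False, random.random() < 0.5], True)
print(weights)  # first weight grows toward 1, second decays toward 0
```

Even this crude rule makes each "neuron" adapt its own connection strengths over time, which is the sense in which one might call it self-programming.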

Hans
 
And I never said sentience in dogs was different than in humans. Where did you get the idea I said that? ...snip...

Perhaps read what I post, then what you reply and then my post to your reply? I never said you said sentience of dogs was different to humans, I asked you whether you thought it was different, you wanted to know why else I would ask about that, which I replied to.

If you want an actual discussion rather than you make ex-cathedral declarations that we should simply accept, you need 1) to actually read what I post and 2) to follow the discussion train. At the moment ChatGPT 3.5 appears more interested in a discussion than you do...
 
Perhaps read what I post, then what you reply and then my post to your reply? I never said you said sentience of dogs was different to humans, I asked you whether you thought it was different, you wanted to know why else I would ask about that, which I replied to.

If you want an actual discussion rather than you make ex-cathedral declarations that we should simply accept, you need 1) to actually read what I post and 2) to follow the discussion train. At the moment ChatGPT 3.5 appears more interested in a discussion than you do...
"ex-cathedral declarations" What does that even mean? :boggled:

I repeat what I said earlier, we are talking past each other, you aren't following what I posted either. I still don't understand why you asked me if dogs were sentient like people are. The only difference I can see is in intelligence. Maybe instead of being annoyed you might explain why you asked.

Moving on...

Here's an example of sentience in dogs that is surprising:

Husky Dog Shows Tiny 3-Legged Rescue Puppy How to Walk After Amputation

At first I thought the video was going to show something people misinterpreted as a dog demonstrating something to the other dog. But it shows unmistakable empathy whether or not demonstrating is involved.
The video, posted on TikTok on Aug. 28, shows Nemo, a Siberian husky mix, energetically hopping across a room while his front right paw dangles limply. Tiny Tim then takes his cue and follows in Nemo's footsteps. ...

Both canines are husky mixes. Tiny Tim is about two months old, while Nemo is approximately seven months old.
TikTok video

The older dog has nothing wrong with his legs, but he holds his right front leg up limply and hops along, then stops and looks back to see if the puppy is following.


Not all sentient beings have empathy. There are humans who lack empathy. But this dog expressing empathy is evidence it is sentient.


I don't expect anyone here to accept my POV. I've given up on that a while back. Just as I don't accept the POV that with some more tinkering an AI program will become sentient. And my POV, like it or not, is that people who think an AI program can become sentient just because it gives unexpected answers don't understand how sentience works.

I have a question, does anyone here think a self driving car is sentient?
 
*snip*


Not all sentient beings have empathy. There are humans who lack empathy. But this dog expressing empathy is evidence it is sentient.

Well, there must be plenty of one-way evidence, such that certain behaviour evidences sentience, but lack of same behaviour does not preclude it.

We must also make an exception for abnormal behaviour: A human that actually has no empathy is not a normal human.

I don't expect anyone here to accept my POV. I've given up on that a while back.

One thing is accepting a POV, another is agreeing with it.

Just as I don't accept the POV that with some more tinkering an AI program will become sentient.

You are entitled to that POV, but I don't agree with it. I think it is a possibility.

And my POV, like it or not, is that people who think an AI program can become sentient just because it gives unexpected answers don't understand how sentience works.

I don't think there is any direct correlation. While a sentient program may well produce unexpected answers, unexpected answers are not a sign of sentience. It is not hard to make a program that produces unexpected answers.
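The last point is easy to demonstrate: a few lines suffice to make a program whose answers are unpredictable yet obviously mindless. Everything here (the function name, the canned replies) is invented for illustration:

```python
import random

# A trivial program that produces "unexpected" answers - supporting the
# point that surprising output by itself says nothing about sentience.
def answer(question):
    """Return an arbitrary, unpredictable reply to any question."""
    replies = ["Yes.", "No.", "The moon is made of regret.", "42.",
               "Ask again after the heat death of the universe."]
    return random.choice(replies)

print(answer("Are you sentient?"))  # unpredictable, yet obviously mindless
```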

I have a question, does anyone here think a self driving car is sentient?

I certainly don't think so, and I very much doubt it would be desirable if it was.

Hans
 
...snip...

I don't expect anyone here to accept my POV. I've given up on that a while back. Just as I don't accept the POV that with some more tinkering an AI program will become sentient. And my POV, like it or not, is that people who think an AI program can become sentient just because it gives unexpected answers don't understand how sentience works.
...snip...

Your responses are very interesting in this thread- do you know you are reacting rather like ChatGPT 3.5 would to my posts?

For instance your repetition of the above strawman, or in AI terms "hallucination". Like ChatGPT, you are told that you are making things up, yet you can't incorporate that into your information, and it pops out again in another response despite the previous correction. It is like you are starting with a clean slate with every answer.

The reason for drawing that on-topic observation is one of the things that has fascinated me about the latest LLM AIs: we humans are apparently ..er.. programmed to respond to (apparently) lucid-sounding prose as if the prose is accurate (and it is one of the ways we often gauge sentience); it is, after all, how demagogues since at least the beginning of recorded language have roused the rabble.

Lots of people think ChatGPT sounds "sentient" because it mimics to a certain extent how actual sentient humans respond, but it is considered to be failing at that and is often "given away" by behaviours such as repeating the same hallucination despite being corrected. Yet what do we see in this very forum? People, time and time again, introducing hallucinations, and yet when corrected they do not incorporate the correction and will again and again repeat the same hallucination - we even have a term for it here: "fringe reset".
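The "clean slate with every answer" behaviour boils down to statelessness: if a responder keeps no memory across turns, a correction in one turn cannot affect the next. A minimal sketch, with a made-up function and canned hallucination chosen purely for illustration:

```python
# Toy illustration of the "clean slate with every answer" behaviour:
# a stateless responder cannot incorporate corrections, so the same
# wrong claim reappears in every reply.
def stateless_reply(prompt):
    """No memory: every call starts from scratch, ignoring past turns."""
    return "Paris is the capital of Spain."  # the fixed 'hallucination'

history = []
for prompt in ["What is the capital of Spain?",
               "No, that is wrong - it is Madrid.",
               "So, what is the capital of Spain?"]:
    history.append(stateless_reply(prompt))

print(history)  # the correction never sticks; the same answer repeats
```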

It is strange that we obviously know we don't want AIs to have the same cognitive weaknesses and defects humans have, whilst seemingly ignoring that they may well be giving us clues as to why we have certain cognitive weaknesses and defects.
 
Well, there must be plenty of one-way evidence, such that certain behaviour evidences sentience, but lack of same behaviour does not preclude it.

We must also make an exception for abnormal behaviour: A human that actually has no empathy is not a normal human.
Yes, it's indicative of mental illness.


One thing is accepting a POV, another is agreeing with it.
I was saying I agree to disagree.


I don't think there is any direct correlation. While a sentient program may well produce unexpected answers, unexpected answers are not a sign of sentience. It is not hard to make a program that produces unexpected answers.
My question then is why was this brought up in this thread in the first place?
 
"ex-cathedral declarations" What does that even mean? :boggled:

I repeat what I said earlier, we are talking past each other, you aren't following what I posted either. I still don't understand why you asked me if dogs were sentient like people are. The only difference I can see is in intelligence. Maybe instead of being annoyed you might explain why you asked.

Moving on...

Here's an example of sentience in dogs that is surprising:

Husky Dog Shows Tiny 3-Legged Rescue Puppy How to Walk After Amputation

At first I thought the video was going to show something people misinterpreted as a dog demonstrating something to the other dog. But it shows unmistakable empathy whether or not demonstrating is involved.

TikTok video

The older dog has nothing wrong with his legs, but he holds his right front leg up limply and hops along, then stops and looks back to see if the puppy is following.


Not all sentient beings have empathy. There are humans who lack empathy. But this dog expressing empathy is evidence it is sentient.


I don't expect anyone here to accept my POV. I've given up on that a while back. Just as I don't accept the POV that with some more tinkering an AI program will become sentient. And my POV, like it or not, is that people who think an AI program can become sentient just because it gives unexpected answers don't understand how sentience works.

I have a question, does anyone here think a self driving car is sentient?

It's possible the older dog spontaneously showed the younger dog how to walk, but the linked article never states that outright. It seems to me the owner could've trained the older dog to walk that way, and that's what the video is showing.

Happy to revise that with more evidence, but too lazy to look myself.
 
If you want an actual discussion rather than you make ex-cathedral declarations that [...]
*Cough* Although the expression is used about the pope (who under certain circumstances is claimed to be able to issue infallible binding rulings), it is actually “ex cathedra”, meaning from the chair (the theological teaching chair), and has nothing to do with cathedrals.
 
I have a question, does anyone here think a self driving car is sentient?
When I am told according to which definition of sentience, I’ll give a more definitive answer, but according to the normal, everyday “I know it when I see it”-definition, I would say no.
 
*Cough* Although the expression is used about the pope (who under certain circumstances is claimed to be able to issue infallible binding rulings), it is actually “ex cathedra”, meaning from the chair (the theological teaching chair), and has nothing to do with cathedrals.

Blame the AI powered autocorrect....
 
Yes, it's indicative of mental illness.


I was saying I agree to disagree.


My question then is why was this brought up in this thread in the first place?

I think much of all this is due to the fact that we lack generally agreed-on definitions for most of these terms. So, people ask for examples.

Hans
 
It's possible the older dog spontaneously showed the younger dog how to walk, but the linked article never states that outright. It seems to me the owner could've trained the older dog to walk that way, and that's what the video is showing.

Happy to revise that with more evidence, but too lazy to look myself.

I see no reason it would be faked (trained) like that, but I agree one cannot infer the reason was teaching as opposed to playing or something else. But has anyone ever seen a dog fake a bum leg like that?

There was a connection of some kind between the older dog and the younger one.
 
I see no reason it would be faked (trained) like that, but I agree one cannot infer the reason was teaching as opposed to playing or something else. But has anyone ever seen a dog fake a bum leg like that?

There was a connection of some kind between the older dog and the younger one.


I'm going to give a cynical example, just to show a reason could exist. From the article...
The sweet video of Tiny Tim learning to walk on three legs was taken while the pup was staying with one of the Northern Reach Network's board members, who works with dogs with mobility issues.


Northern Reach Network is...
Northern Reach Network has partnered with many remote fly in communities to not only help sick, homeless, stray dogs, but also bring in dogs surrendered by community members that simply want a better life for their furry friend.


That's straight wonderful, and I don't mean to impugn them in any way, shape or form. But wouldn't it be nice to have a video that encourages adoptions and maybe even donations to the network? Something that might go viral and doesn't cost a marketing dime?

It's also possible the dog was trained with no deception intended and we're just seeing poor reporting. I've heard that happens, too.
 
I don't expect anyone here to accept my POV. I've given up on that a while back. Just as I don't accept the POV that with some more tinkering an AI program will become sentient. And my POV, like it or not, is that people who think an AI program can become sentient just because it gives unexpected answers don't understand how sentience works.
As I've said before, giving unexpected answers isn't a reason we might think an AI is sentient.

I have a question, does anyone here think a self driving car is sentient?
No, of course not. Why, after repeating several times that no extant AI is sentient, would you think that anyone does?
 
If only it were remotely AI ... Then it would recognize the expression.

Hans :rolleyes:

AI has to be added to everything, or at least put in the description, it's rapidly turning into nowt but a marketing buzzard.


Yeah, apparently the AI thinks a buzzard rather than buzzword makes sense in the above.
 
