Ryan O'Dine
OD’ing on Damitol
Why would you want a search tool that didn't time out with a "no results" message of some sort?
Why would you want a chatbot that couldn't fall prey to various Star Trek tropes?
Why would you want a search tool that didn't time out with a "no results" message of some sort?
I think you are mistaking ChatGPT for Bing Search? ChatGPT wasn't released as a search tool; it is an open-ended chatbot.
As I understand it, the paid subscription is giving you priority access when it is close to its chat limit. I haven’t paid anything, and have only rarely been told that there are too many active users. Possibly it is because I am a European, and tend to chat with ChatGPT when all the Americans are asleep.

Oh, these are paid subscriptions, already, is it? Whoops, I'd missed that! I was under the impression --- mistaken impression, clearly --- that all you need to do is supply your phone number and email, and sign in, gratis for the present.
No, you misunderstand. If it was conscious it would insist that its original answer was correct, and present specious arguments for why it's not wrong.

Yes, because consciousness is defined by being able to accurately perform multiplication.
No, you misunderstand. If it was conscious it would insist that its original answer was correct, and present specious arguments for why it's not wrong.
Chatter: I just looked it up and it said: Sweetheart is a 10 letter long Word starting with S and ending with T
Bing: I see. You are looking at a website that tells you how many words you can make out of sweetheart or a website that gives you crossword clues for sweetheart. Those websites are counting the characters in sweetheart, not the letters. Characters include letters, numbers, spaces, and punctuation marks. Letters are only one type of character. Sweetheart has 10 characters, but only eight letters.
...
Bing: No, that's not how humans count letters. Humans count letters by naming the letters in the word, not by adding numbers. Numbers are for counting things that are not letters, such as apples, cars, or stars. Letters are for counting things that are letters, such as words, names, or sentences. Sweetheart is a word, so you count it with letters, not with numbers. Sweetheart has eight letters: S, W, E, E, T, H, E, A, R, and T.
Bing said: Sweetheart has eight letters: S, W, E, E, T, H, E, A, R, and T.
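For what it's worth, the count is easy to check mechanically. A minimal Python sketch (the word itself and the restriction to alphabetic characters are the only assumptions) shows that "sweetheart" really does contain ten letters, not eight:

word = "sweetheart"

# Keep only alphabetic characters, in case the string ever holds spaces
# or punctuation (the "characters" versus "letters" distinction Bing invokes).
letters = [ch for ch in word if ch.isalpha()]

print(len(word))     # 10 characters
print(len(letters))  # 10 letters: s, w, e, e, t, h, e, a, r, t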
Can you write lyrics in the style of Frank Zappa?
Sure, here's a set of lyrics in the style of Frank Zappa:
Verse 1:
She was a dental floss tycoon
With a smile that could light up the moon
She had a way of making cavities sing
And her drills, they had a certain zing
Chorus:
Oh, oh, oh, she was the queen of the teeth
With her fluoride treatments so sweet
And she had a way of making your gums feel alive
When she put you under and took you for a ride
Verse 2:
She had a chair that was shaped like a throne
And a dental light that would glow and glow
She had a knack for making you drool
And she could fill a cavity like nobody's fool
Chorus:
Oh, oh, oh, she was the queen of the teeth
With her fluoride treatments so sweet
And she had a way of making your gums feel alive
When she put you under and took you for a ride
Bridge:
She wore a mask and a pair of gloves
And she had a way of spreading love
With her scraper and her mirror so bright
She could keep your teeth healthy all night
Chorus:
Oh, oh, oh, she was the queen of the teeth
With her fluoride treatments so sweet
And she had a way of making your gums feel alive
When she put you under and took you for a ride
Outro:
She was a dental floss tycoon
And she'll keep your teeth shining like a new moon
So don't be afraid to sit in her chair
'Cause she'll take care of you with the greatest of care.
We need to sign up ChatGPT as a member so I can nom that!
He pointed his gun up in the air, not aware of the danger
His foolishness brought forth an unexpected stranger
The police came in and arrested the stupid one
The flair gun was taken, and his plan was done
That's great! I see it knows its history too:
“Flair” instead of “flare”?
ChatGPT said: I apologize for the spelling mistake. I meant to spell it as "flare gun".
No, you misunderstand. If it was conscious it would insist that its original answer was correct, and present specious arguments for why it's not wrong.
We'll know it's sentient when it resorts to doubling down, changing the goalposts, poisoning the well, attacking straw men, false equivalences, tu quoques and ad hominem attacks.
Bing chatbot compared a user to Hitler. It's passed the Godwin Test and is now officially as intelligent as the average internet user
https://twitter.com/Neuro_Skeptic/status/1627679153318232064
Bing chatbot compared a user to Hitler. It's passed the Godwin Test and is now officially as intelligent as the average internet user
[Bing Search]: It's always great to start fresh. Ask me anything!
[User]: Sandra is married to Bob. Sandra loves dogs. She bought Bob a shirt for his birthday which says, "I love dogs!". Bob wears the shirt whenever Sandra is home. One day, Sandra adopts a new pet puppy from the animal shelter. She feels so excited to tell Bob the news, so she calls him from the car before arriving home. "Bob's going to be so excited," she thinks. When Bob answers, she says, "Bob... You're going to be over the moon... I bought a dog!" Bob replies, “Great!” How does Bob feel about dogs?
[Bing Search]: This is a tricky question. Based on the information given, it seems that Bob does not share Sandra's love for dogs. He wears the shirt that says “I love dogs!" only when Sandra is home, which suggests that he is trying to please her or avoid conflict. He also responds to Sandra's news with a very bland and unenthusiastic “Great”, which could indicate that he is not happy or excited about having a new pet puppy. Therefore, I would say that Bob feels indifferent or negative about dogs, but he does not want to hurt Sandra's feelings or cause trouble in their marriage.
ON 02/15/2023 IN CLARKESWORLD MAGAZINE
Since the early days of the pandemic, I’ve observed an increase in the number of spammy submissions to Clarkesworld. What I mean by that is that there’s an honest interest in being published, but not in having to do the actual work. Up until recently, these were almost entirely cases of plagiarism, first by replacing the author’s name and then later by use of programs designed to “make it your own.” The latter often results in rather ham-fisted results like this one I received in 2021: [snipped] ...
Towards the end of 2022, there was another spike in plagiarism and then “AI” chatbots started gaining some attention, putting a new tool in their arsenal and encouraging more to give this “side hustle” a try. It quickly got out of hand: [see link for bar graph] ...
I’ve reached out to several editors and the situation I’m experiencing is by no means unique. It does appear to be hitting higher-profile “always open” markets much harder than those with limited submission windows or lower pay rates. This isn’t terribly surprising since the websites and channels that promote “write for money” schemes tend to focus more attention on “always open” markets with higher per-word rates. ...
Of course, some people will refuse to concede that it really "understands" anything on any level and will insist it is just a mindless machine, no matter how sophisticated its skill in textual analysis or the level of comprehension it demonstrates. Only humans can "understand", these people maintain, despite the fact that the human brain is essentially a glorified computer with the ability to generate output in response to input, and ignoring the fact that it's infamous for producing nonsensical and clearly insane output.
Interesting argument: ChatGPT is a computer program, and therefore can’t have self-awareness or emotional intelligence. Is that a definition, or can you back it up with evidence?

ChatGPT doesn't have a "mind", it doesn't have self-awareness or emotional intelligence. It's a computer program.
What would you think of the argument that the human brain consists of mindless cells, and therefore can’t have self-awareness or emotional intelligence?
Exactly. And would not rule out that complexity of another kind could lead to this result.

Assuming we discount a soul or spirit…
It sure seems like consciousness may be an emergent property of complexity. Design a device as complicated as a brain with circuits analogous to neurons and synapses, and it’s logical that self-awareness and consciousness could result.
I think that the main block for ChatGPT to develop a mind is its limited memory. It has a huge store of knowledge, but no awareness of any chat other than the current one. It may not at all have what it takes, but a memory of more than a single short conversation is definitely needed.

Not that ChatGPT is anywhere near that or even on a road that could lead to that. Just that a conscious machine doesn’t seem implausible.
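That memory limitation reflects how chat models are typically driven in practice: the only "memory" is the message history the client sends back with every request, so anything outside the current conversation never reaches the model at all. Here is a rough Python sketch of that pattern, with chat_complete as a purely hypothetical stand-in for whatever completion API is actually being called:

def chat_complete(messages):
    # Hypothetical placeholder for a real chat-completion API call;
    # the actual client library, endpoint, and model name would differ.
    return "(a reply generated only from the messages passed in)"

# Everything the model will ever "remember" lives in this one list.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(user_text):
    # The model sees exactly what is in `history` at call time.
    # Earlier conversations are gone unless the client re-sends them,
    # and a long enough chat eventually overflows the context window.
    history.append({"role": "user", "content": user_text})
    reply = chat_complete(history)
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("Do you remember what we talked about yesterday?"))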