
Why does James Randi endorse the Singularity Summit?

Humes fork

This one is awesome! I realize people may truly be suffering psychological issues here, and that is not to be taken lightly. The above link takes that quite seriously though and I think it addresses the matter quite well. Plus it's really silly to fear someone torturing some future simulation of yourself. That sounds like something some Star Trek writer invented just in order to carry a plot line.

Anyway, I see your point. This seems like some awfully intelligent people putting their talent mostly to waste. Remember, intelligence is not wisdom.
 

The very fact that they try so hard to censor it only makes it far more widely known than it would otherwise be.

Perhaps I'm missing something, but I find it rather stupid to worry about being tortured by a future godlike AI. And people literally having nightmares about it? Geez...

Anyway, I would consider it a fraudulent organization. They claim to be uniquely positioned to save humanity (and you can help them save humanity by sending them money!), yet they have almost no interaction with the scientific and scholarly communities that research what they prattle about. How anyone can take this group seriously is beyond me.
 
Should he talk about creationism at a creationist conference?
If he finds it an interesting subject and worth talking about, why not?

Now as it turns out, Randi is unlikely to find that particular subject interesting. What are the differences between creationism and the singularity that might cause him to endorse one, but not the other?
 


Probably would be worth asking Randi's take on it. He seems quite an adamant skeptic, but it is always in your best interest to ask.

I believe that no matter how sincere the person may seem it is important to question anything that appears out of place.

I learnt that lesson the hard way.

If you wanted to draft an email to Randi I would be happy to help.
 
I think I figured out what he meant by Bayes.

The relevant part of Bayes' theorem implies that if the same evidence is predicted equally well by more than one hypothesis, the likelihoods cancel, and the hypothesis with the higher prior probability remains the more probable one.

Solomonoff induction implies that the theory which is simpler has a higher prior probability. Kolmogorov complexity measures simplicity by the length of the description.

He thinks that many-worlds is shorter to describe than quantum collapse. Combining all of this, you should believe many worlds.
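To make that chain of reasoning concrete, here's a toy sketch of it in Python. The description lengths are invented numbers purely for illustration, not anything from actual physics:

```python
# Toy Bayesian comparison with a Solomonoff-style simplicity prior.
# Description lengths (in bits) are made-up numbers for illustration.
desc_len = {"many-worlds": 10, "collapse": 14}

# Solomonoff-style prior: P(H) proportional to 2^(-description length).
prior = {h: 2.0 ** -k for h, k in desc_len.items()}
total = sum(prior.values())
prior = {h: p / total for h, p in prior.items()}

# Both interpretations predict the observed data equally well,
# so the likelihoods are identical and cancel in Bayes' theorem.
likelihood = {"many-worlds": 1.0, "collapse": 1.0}

# Posterior ~ prior * likelihood, renormalized.
posterior = {h: prior[h] * likelihood[h] for h in prior}
total = sum(posterior.values())
posterior = {h: p / total for h, p in posterior.items()}

# With equal likelihoods, the posterior just reproduces the prior:
# the "shorter" hypothesis wins by construction.
print(posterior)
```

Note that the data never gets a say: since the likelihoods are equal, the whole argument is carried by the prior, i.e., by whatever description lengths you assumed going in.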

This is wrong because Kolmogorov complexity is relative to a choice of description language and can't say that one theory is simpler than another in an absolute sense. It's also very Rube Goldberg-ish. It's like deciding which loaf of bread to buy by reasoning that the cheaper loaf leaves you with more cash in your pocket, computing the utility function of money to determine that having more cash in your pocket is desirable, and then, by appropriate application of Aristotelian logic, concluding that if something that leaves you with more cash is desirable, and cheaper bread leaves you with more cash, then cheaper bread is desirable.
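That first objection is easy to demonstrate: which theory counts as "shorter" depends entirely on which description language you pick. Here's a trivial sketch where each hypothetical language happens to make a different theory a one-token primitive (both "languages" and their token counts are invented):

```python
# Kolmogorov complexity is defined relative to a description language.
# Two hypothetical "languages": in language A, "many-worlds" is a
# one-token primitive and "collapse" must be spelled out; in language B
# it is the other way around. Token counts are invented for illustration.

def description_length(theory, language):
    """Length of the shortest description of `theory` in `language`, in tokens."""
    return language[theory]

lang_a = {"many-worlds": 1, "collapse": 7}   # MWI built in
lang_b = {"many-worlds": 7, "collapse": 1}   # collapse built in

# The "simpler" theory flips depending on the language chosen.
for lang_name, lang in (("A", lang_a), ("B", lang_b)):
    shorter = min(lang, key=lang.get)
    print(f"Language {lang_name}: simpler theory is {shorter}")
```

The invariance theorem only guarantees the lengths agree up to an additive constant across languages, and for comparing two particular short theories that constant can dominate, so the ordering isn't absolute.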

(Edit: removed reference to Kolmogorov complexity being used to compute outputs. Not as relevant as I thought.)
 

Well, from what I understand, the interpretations of quantum mechanics are the same no matter whether you favor MWI, Copenhagen, or shut-up-and-calculate; that is, they are mathematically and empirically identical. That's why they are called interpretations, intended to explain QM in human terms, rather than hypotheses.

There is a hilarious comment on another forum about these guys, by someone who, from what I understand, works with AI for a living (real AI, not in a crackpot way):

Imagine there was an online community that pretended they had your job.

Pretend there is a website of trans-accountants who have never had an accounting job nor had any education in accounting. They talk about accounting all the time but they make up words for it and misuse what few words they actually know. Everything they know about accounting they learned from movies and adventure novels with accountants. They talk about post-ledger accounting and they talk about maximizing your redline value returns.

Or people who pretend to manage video rental stores but have never even owned a television.

...

I thought it would be fun to find one of Eliezer Yudkowsky's AI algorithms or theories and just rip it apart, but it turns out he doesn't have any theories or ideas of any kind. All he has is endless mental masturbation about how AI needs to be beneficial to mankind and some incredibly narcissistic bragging about how even though he hasn't come up with a single solution to any kind of problem, he's the right kind of person to do it.

It would be interesting to see someone grounded in science and philosophy go through their writings, particularly "the sequences" (a pretentious way of saying essays), and examine them. But the length is prohibitive (according to RW, "the sequences" are longer than the LOTR trilogy).
 

Possibly. I'd be interested to know his take on it. When it comes to this group, they do have an Internet presence and certain devoted cultists, but they are too small to do much damage and hence to be particularly noticeable.
 

Uhh, is there some catch here that I'm missing?
 