It would be an expensive way for the current phishing bods.
A Guardian "Long Read" article on why the US-educated top AI researcher Song-Chun Zhu left the US to return to China
‘I have to do it’: Why one of the world’s most brilliant AI scientists left the US for China
In 2020, after spending half his life in the US, Song-Chun Zhu took a one-way ticket to China. Now he might hold the key to who wins the global AI race.
www.theguardian.com
That's a relief. It would be terrible if corporate US won the AI wars. Dangerous knowledge is best in the hands of Chinese working people.
The biggest indicator of a bubble. Economists are pointing out that the $500bn to $1tn that US tech companies are spending this year on AI infrastructure is single-handedly responsible for a huge chunk of total economic growth.
It's not sustainable, and there is no business model to recoup the costs. The most immediate effect is rising electricity prices for ordinary consumers.
Another reason to support the Guardian. I send them a few dollars every now and again.
I was coming to post that myself - excellent read.
Sadly she's a YouTuber who I think has gone off the rails a tad in regard to science education.
Oh, dear. This is like an episode of The Orville.
We have to give up an entire galaxy to see an improvement?
It will really only happen when consoles can do it.
The issue in games is what you can do locally and what you have to farm off to a cloud service of some kind. As mentioned before, I know a company that is selling AI tools to game developers that use a heavily quantised open-source LLM running locally for conversation generation and text-to-voice. But running it locally does require a fairly hefty GPU with a proper amount of onboard RAM - these latest-generation GPUs shipping with only 8GB are a travesty.
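For a rough sense of what "run locally" means here, below is a minimal sketch using the llama-cpp-python bindings to load a heavily quantised GGUF model and generate an NPC line on a consumer GPU. The model file name, VRAM figures, and prompt are illustrative assumptions, not the specific product mentioned above.

```python
# Minimal sketch: local, quantised LLM inference for NPC dialogue.
# Assumes llama-cpp-python is installed with GPU support and that a 4-bit
# GGUF model file (hypothetical name below) fits in the card's memory.
from llama_cpp import Llama

llm = Llama(
    model_path="models/dialogue-7b.Q4_K_M.gguf",  # hypothetical 4-bit quantised model
    n_gpu_layers=-1,  # offload all layers to the GPU; lower this on VRAM-starved 8GB cards
    n_ctx=2048,       # context window; bigger windows cost more VRAM
)

prompt = (
    "You are a gruff blacksmith NPC in a fantasy RPG. "
    "The player asks: 'Can you repair my sword?' Reply in one sentence."
)

out = llm(prompt, max_tokens=64, temperature=0.7, stop=["\n"])
print(out["choices"][0]["text"].strip())
```

As a rough rule of thumb, a 4-bit quantisation of a 7B-parameter model needs on the order of 4-5GB of VRAM before the context cache is counted, which is why 8GB cards are about the practical floor; a text-to-voice model would be a separate load on top of that.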
People are making mods for games like Fallout 4 that contain entire questlines with all NPCs voiced by AI. On the one hand, this means that quality quest mods are more available than ever, but on the other hand it means that human voice actors aren't getting the work.
The AI bubble is 17 times the size of the dot-com frenzy — and four times the subprime bubble, analyst says