• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Cont: Musk buys Twitter II

Starmer has made it pretty clear that it's not just "people in bikinis" but rather the CSAM that is the problem.

But thanks for letting everyone know where you come down on the issue by posting CSAM apologist propaganda.
Yeah, in deciding how far to go to protect your cult, I'd think people would understand that their blind devotion should stop somewhere short of proclaiming they don't understand the difference between sexually depicting adults in bikinis and doing the same for little children. That's what worries me about the Elonia-MAGA cult members, though: they just appear to be running 100% on autopilot. No matter what their idols do, no matter how absurd, ridiculous, or terrible, they think they have to step in to defend them. Sad.
 

“I’m against CSAM” seems like the easiest layup ever and yet they can’t manage it. Pretty obvious conclusion to draw from that.
 
That is correct.

To get everybody up to speed: Elon Musk's AI chatbot, which is integrated into X, has been generating CSAM at users' requests and has flooded the platform with it, and Musk doesn't seem too bothered. Nor do the people who've come to this thread to hand-wave it away.

Draw your own conclusions from that.
 
I recently complained on this forum about the prevalence of porn on X.
I count myself lucky not to have this sort show up while browsing.
 
Not sure where you are in the world, but you should be very careful. In the UK, for example, it is a serious criminal offence to have any such material; the fact that it was thrown up by the system with no input from you cannot be used as a defence. It's what is often described as a "strict liability" crime: your intent has no bearing on whether the offence of viewing/possessing has occurred.
 
There's no CSAM in my feed, but there is some mildly sexualised material, e.g. women's nipples falling out of their dresses.
 
So far.

Unless X has now taken action to stop its AI creating and publishing CSAM, I would be very wary of using X in any way you can't 100% control. It sounds like personal "feeds" aren't under a user's total control and that X populates them for you?
 
I understand the risks, but, frankly, that's the whole internet. If you Google the wrong search term, you may accidentally see stuff you shouldn't.
 
About a decade ago I bought a motorcycle and my boss told me to get a battery tender. For some reason it stuck in my head as "battery stranger", and that was an awkward-as-hell Google search.
 

My rather naive, born-again Xtian older brother was engaged in an engineering project and came to regret Googling 'O-Rings'.
 
I understand the risks, but, frankly, that's the whole internet. If you Google the wrong search term, you may accidentally see stuff you shouldn't.

You have significantly more control over your Google search results than you do the algorithms that control your Twitter feed.

I’m also not aware of a CSAM-generating program being integrated into Google.
 
Not sure where you are in the world, but you should be very careful. In the UK, for example, it is a serious criminal offence to have any such material; the fact that it was thrown up by the system with no input from you cannot be used as a defence. It's what is often described as a "strict liability" crime: your intent has no bearing on whether the offence of viewing/possessing has occurred.
Frankly, it's got to the stage where I feel like closing most, if not all, of the social media accounts I have anyway.
 
