As much as people maligned it, the study cited here got things absolutely correct: sites like Facebook, Twitter, and YouTube, by design, don't merely allow these echo chambers, they funnel people into them. It's not at all difficult to go from some random topic, to Dave Rubin or Sargon of Akkad, to Milo, to Richard Spencer.
That's the big problem: the "You might like this" algorithms, automatically suggesting and directing users to similar or related content. Great if you're an advertiser looking to generate views, but also extremely effective at sucking people into echo chambers. And the white nationalists understand this, and use it to recruit. There is a hell of a lot of borderline white-supremacist stuff out there, probably mostly just solo Angry White Men ranting, along with some deliberately created content designed to take advantage of the algorithms. And there are a whole lot of intermediate steps along the way that the algorithms will happily surface.
I've seen a lot of this kind of thing with my own feeds on Facebook, and particularly YouTube. I look at one video on swords, and soon I have a half-dozen more suggested to me. I look at one music video by a lame Brit-pop band for the Forumvision contest in Community, and I have a bunch of videos by that artist and similar Brit-pop bands popping up.
I click through and watch a few more of those weapon videos, and within a week there is literally nothing else on my Recommended feed: not just swords and other medieval weaponry, but historical and modern firearms as well, and a small handful of other medieval- and historical-themed videos.
The Brit-pop stuff eventually stops showing up, because I don't click on any of it, but it takes weeks, or sometimes months, before it disappears entirely. And if I click on just one of those videos, the number of similar ones showing up in my feed doubles.
I see the same thing on other social media sites. I click on one artist whose work I like, or which looks moderately interesting, and suddenly I get a dozen similar artists recommended.
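For the curious, here's roughly what that feedback loop looks like in miniature. This is a minimal toy sketch, not anything resembling YouTube's actual ranking system; the topic list, the "weight = 1 + past clicks" rule, and the 5% stray-click rate are all made-up assumptions. The only point it demonstrates is that a recommender which weights candidates by past clicks, fed by a user who clicks what it recommends, converges on a single topic very quickly.

    # Toy simulation of an engagement-driven "you might like this" loop.
    # Everything here is illustrative, not any real platform's algorithm:
    # the recommender just weights each topic by how often it was clicked,
    # and each click feeds back into the next round's weighting.
    import random
    from collections import Counter

    TOPICS = ["swords", "firearms", "britpop", "cooking", "news", "gardening"]

    def recommend(clicks, k=6):
        # Weight each topic by 1 + past clicks on it. With no history this
        # is a uniform draw; every click skews the next feed further.
        weights = [1 + clicks[t] for t in TOPICS]
        return random.choices(TOPICS, weights=weights, k=k)

    def simulate(days=30, favourite="swords"):
        clicks = Counter()
        for day in range(1, days + 1):
            feed = recommend(clicks)
            # The user reliably clicks their favourite topic, and only
            # occasionally (5%) strays onto anything else.
            for video in feed:
                if video == favourite or random.random() < 0.05:
                    clicks[video] += 1
            if day % 10 == 0:
                share = feed.count(favourite) / len(feed)
                print(f"day {day:2}: {share:.0%} {favourite} -> {feed}")

    random.seed(1)
    simulate()

Run it and the feed drifts toward the favourite topic round after round, even though the user never asked for more of it. The stray-click rate stands in for my occasional Brit-pop click: bump it up and those topics linger for the weeks or months I described above.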
It's not hard to see how someone could be driven farther and farther into right-wing echo chambers by a mechanism originally intended to drive advertising revenue. It may start out comparatively innocuous: just some dudebro semi-humorously ranting about how awful his girlfriend is, and how a black woman was given the promotion the dudebro so obviously deserved more than she did. But it quickly snowballs from there: into garden-variety anti-feminism and complaints about the "reverse racism" of Affirmative Action; into reactionary "women should be subservient to men", "job-stealing immigrants", and "welfare queens"; into "Mexican rapists invading America", "violent black thugs raping white women and murdering police officers", "queers recruiting children", and PUA culture; and finally into "white genocide" and "forced abortion", various anti-Semitic and anti-immigrant conspiracy theories, full-on white nationalism, Incelism and violent misogyny, forced conversion, and "Pizzagate" sexual-abuse conspiracy theories.
It's not so much a "slippery slope" as it is a "shoved forcefully in that direction" process.
I've seen too many of my family members, and now-former friends and acquaintances, end up down those rabbit holes. I saw it happening in the churches I attended growing up, and I can see it accelerating now as the technology makes it far easier to disseminate propaganda to sympathetic and confused ears.
But in any case, one can simply let autoplay go and quickly end up dealing with some real genocidal ideas.
Exactly. There's a very good reason that I disabled Autoplay on my web-browser-based YouTube feeds. Unfortunately, there doesn't seem to be any way to turn it off in my Roku YouTube app, so some pretty awful stuff pops up from time to time if I'm not paying attention. And each video that plays counts as a view, and generates recommendations for more similar stuff (fortunately, my Roku YouTube app/account is used primarily for music and My Little Pony fan videos).
These platforms are deliberately designed to create these sorts of echo chambers, because echo chambers drive views very effectively, more so than diversity of content, and these companies care far more about money than they do about their social impact. The only reason they're making any noise about removing the worst of these echo chambers (without changing the mechanism that drives them) is not any social "wokeness" or sense of responsibility, but pure pragmatic self-interest: staving off the threat of government regulation such as we already see in Europe.