I’ve been on Substack for a bit, mostly to follow Nate Silver. There are some interesting folks on there, and also a lot of complete dipshits. But recently like 75% of the stuff showing up in my feed is frothing right-wing conspiracy bullshit. Are most of the people on Substack literal white supremacists, or have I just wandered down the wrong alley of the algorithm somehow, and if so how do I get back to the land of sanity?
Substack is in the process of being enshittified by extremists just like every other social media platform. I give it another six months before honest contributors start migrating to yet another platform. It’s the archetypical example of “Why we can’t have nice things.”
Stranger
I’m guessing the latter, because I have not seen this.
(I joined Substack to subscribe to Paul Krugman’s economic articles after he left the New York Times.)
Substack’s Nazi problem goes back a while.
2023:
2024:
Etc etc etc.
The TLDR if you don’t want to click through: The site will host basically any kind of writing as long as it isn’t explicitly illegal (incitement, bomb instructions, yada yada). The Nazis discovered this, and quietly migrated en masse.
Nobody really noticed until the automated promotional algorithm started surfacing their content in "recommended blog" digest messages. Substack themselves aren't, like, deliberately choosing to advocate for Nazism (as far as anyone knows); it's just a consequence of the increasing volume of content combined with a dumb computer system blindly promoting whatever is "popular."
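To make the mechanism concrete: nobody outside the company knows what Substack's ranker actually looks like, but a content-blind "what's popular" picker is roughly this dumb. Everything below is a toy sketch; the Post fields and the scoring are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str    # never inspected by the ranker
    opens: int
    shares: int

def pick_digest(posts: list[Post], n: int = 5) -> list[Post]:
    """Rank purely on engagement counts; the text itself is never read."""
    return sorted(posts, key=lambda p: p.opens + p.shares, reverse=True)[:n]

# Whichever community generates the most opens and shares wins the slots,
# whether that's a knitting newsletter or a Nazi one.
feed = [Post("Sourdough tips", 900, 40), Post("Rage-bait conspiracy post", 5000, 1200)]
print([p.title for p in pick_digest(feed, n=1)])
```

Note there's no step anywhere that asks what a post is actually about. That's the whole problem.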
Conclusion? Suddenly it looks like Substack is a Nazi bar, because it kind of is.
The key to having a good experience on social media is to follow a strictly curated set of accounts, and never look at the comments.
Which is exactly what social media providers assiduously try to prevent you from doing, by pushing links in your face and driving engagement through manufactured outrage. Twitter users were inundated with unavoidable cruft even before Elon Musk took it over and turned it into a fascist propaganda machine; it was always a smoldering dumpster fire just waiting for someone to dump in a bottle of acetone.
Stranger
I’m on Twitter, and I have to say I see none of that stuff. Because I only follow airlines, airports, musicians (mostly non-Americans) and music venues. They never post anything about politics.
Which isn’t why most people are on Twitter/X.
Stranger
Maybe the algorithms don’t think you’re a Nazi. Maybe they just think you should be.
I’ll be honest: I originally got on Twitter because it was the fastest way to get updates on where my favorite musicians are touring. It’s still a very good place for that. Tour announcements generally get posted there first.
“The algorithms” aren’t about catering to you. They’re about delivering you to their advertisers and backers with your head on a platter.
You are basically useless to Twitter because they can’t sell you anything or use you to generate advertising revenue, which is a great way to be if you can manage it. But most people are on social media to be ‘quasi-social’ (follow and gain followers), which provides a mechanism by which they can be manipulated for engagement.
Stranger
At one point I was getting a lot of anti-woke YouTuber recommendations. I think it's because I watch video-gaming and history-based videos and "The Algorithm" thinks these are related interests.
So they say that these algorithms feed you back what they think you engage with. So if you engage with left-wing content, they will give you more and more extreme left-wing content, and the same for any other issue.
However, I believe that this is a lie, inaccurate, or at least incomplete. The reason I say this is that I'm open to "left wing content" but never get any in my recommendations. But if I watch 9 seconds of one Jordan Peterson video, YouTube will be harassing me for the next 6 months: "Hey, you want this Ben Shapiro video? How about more Jordan Peterson?" Half my feed is right-wing bullshit because I accidentally opened one video once and watched a few seconds of it.
These algorithms are far more eager to recommend right-wing content than left-wing content. Now, that may or may not be deliberate; it may not be a plot built into the algorithm on purpose. Maybe they just find that if you show a left-wing person left-wing content, they engage with it 5% of the time, but if you show a right-wing person right-wing content they engage with it 80% of the time, so they come out ahead just by showing everyone right-wing content. That would be the "innocent" explanation. But in my experience you are FAR, FAR more likely to be bombarded with right-wing content by content recommendation algorithms.
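Just to show the arithmetic of that "innocent" explanation: a dead-simple engagement maximizer with no politics wired into it at all will still converge on serving almost everyone the higher-engagement content. Here's a toy epsilon-greedy sketch using the made-up 5%/80% rates from above; it's not anything any real platform runs:

```python
import random

# Hypothetical per-impression engagement rates from the post above;
# nothing here is measured data.
ENGAGE_PROB = {"left": 0.05, "right": 0.80}
EPSILON = 0.1  # fraction of impressions spent exploring a random arm

shown = {"left": 0, "right": 0}
clicks = {"left": 0, "right": 0}

def pick_arm():
    """Mostly serve whichever arm has the best observed click rate."""
    if random.random() < EPSILON or not any(shown.values()):
        return random.choice(list(ENGAGE_PROB))
    return max(shown, key=lambda a: clicks[a] / shown[a] if shown[a] else 0.0)

for _ in range(100_000):
    arm = pick_arm()
    shown[arm] += 1
    if random.random() < ENGAGE_PROB[arm]:
        clicks[arm] += 1

print(shown)  # e.g. {'left': ~5000, 'right': ~95000}
```

Run it and the "right" arm soaks up roughly 95% of the impressions; only the 10% exploration budget keeps the other arm alive at all. Nobody had to put a thumb on the scale for the feed to end up lopsided.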
Any time you stumble upon content you don’t want it feeding you more of, make sure you give it a dislike and delete it from your watch history.
I don’t see a Substack feed and would feel free to ignore it if I did.
Generally, I read Substack articles that are linked on other sites or that I find through searches, from contributors like Unbiased Science, Your Local Epidemiologist, Dr. Paul Offit etc.
Maybe I’m denied the pleasure of being “fed” far-right articles since most of my access to Substack is as an author.
There’s a huge amount of far-right and even neo-Nazi glurge on Twitter, but you mostly avoid it if you only follow people you respect; then the crazy stuff only surfaces when someone you follow reacts to it.
Also, your real name may contain an ümlaut.