Lovely thought Ben. Good to hear from you!
I spent a lot of my life and money (over five years of it) thinking about how to build better algorithms.
We have a bit of a chicken / egg problem. Is it the algorithm, or is it the preferences of the users, that is the problem?
I'd argue the latter.
What I learned, which was counter-intuitive, was that the vast majority of people aren't interested in thinking hard. This community, in large part, is an exception: many members pride themselves on seeking out intellectually challenging material.
That's not the norm. We're not the norm.
My belief that every human was by nature "curious" and wanted to be engaged deeply was proven false.
This isn't to claim that incuriosity is our nature, but when testing with huge populations in the US (specifically), that's not how adults behave.
The problem, to me, is deeper and is rooted in our education system and work systems that demand compliance over creativity. Algorithms serve what users engage with; if users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.
> Is it the algorithm, or is it the preferences of the users, that is the problem? I'd argue the latter.
> Algorithms serve what users engage with
User engagement isn't actually the same thing as user preference, even though I think many people and companies take the shortcut of equating the two.
People often engage more with things they actually don't like, and which create negative feelings.
These users might score higher on engagement metrics when fed this content, but actually end up leaving the platform or spending less time there, or would at least answer in a survey that they don't like some or most of the content they are seeing.
This is a major reason I stopped using Threads many months ago. Their algorithm is great at surfacing posts that make me want to chime in with a correction, or click to see the rest of the truncated story. But that doesn't mean I actually liked that experience.
Thanks for the thoughtful response.
Curious about this. Don't have an angle, just trying to survey your perspective.
You shared:
> People often engage more with things they actually don't like, and which create negative feelings.
Do you think this is innate or learned? And, in either case, can it be unlearned?
I think it’s situational.
If you measure which TV shows and movies I watch, that’s a vote of preference.
If you measure which news headlines evoke a comment from me, that’s a measure of engagement but not necessarily preference.
People respond to a lot of things that annoy them, and I think it’s a pretty common human trait. Advertising your business with bright lights and noise can be effective, but we often ban this in our towns and cities because we prefer life without them.
Not OP, but I think most of us have an instinct to correct things that we think are wrong.
I don't think such instincts can be unlearned, but they can be held in check by the realization that naive attempts to fix things can make things worse, including how we feel about ourselves.
This conscious inhibition, however, requires cognitive effort.
> Algorithms serve what users engage with; if users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.
Algorithms have been adapted; they are successful at their goals. We’ve put some of the smartest people on the planet on this problem for the last 20 years.
Humans are notoriously over-sensitive to threats; we see them where they barely exist, and easily overreact. Modern clickbait excels at presenting mundane information as threatening. Of course this attracts more attention.
Also, loud noises attract more attention than soft noise. This doesn’t mean that humans prefer an environment full of loud noises.
> This community, in large part, is an exception: many members pride themselves on seeking out intellectually challenging material. That's not the norm. We're not the norm.
I recommend against putting HN on a pedestal. It just leads to disappointment.
It's true -- I do enjoy this community even though it's failed to serve my every thought with the love that I surely deserve!
Odd reply, but OK. For what it's worth I largely agree with everything else you said.
>>The problem, to me, is deeper and is rooted in our education system and work systems that demand compliance over creativity. Algorithms serve what users engage with; if users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.
Technically that's true. Thing is, the UI/UX isn't built for long-form content. The platform, interface and algorithm when taken as a whole represent more of a dopamine delivery system heavily biased towards short-form content.
That dynamic in turn is deleterious to cognition, to the point that it ends up fighting any external factors that could change user behavior for the better.
In other words the algorithm is part of a larger format, and that format is arguably the real drag. Of course, the algorithm being properly transparent and accountable to its users would certainly help.
I think we're relatively aligned. But you're raising another chicken & egg problem: whether the algorithm (and engagement with it) is driving the design of the feeds, or the other way around.
Arguably, the initial design was a shot in the dark, and with data-driven design they're approaching some local maximum, trying to improve metrics that we probably all agree aren't the best for our mental health or wellbeing.
Nice chat, apologies if my response was off-putting. It was intended to be self-deprecating humor.
>Arguably, the initial design was a shot in the dark, and with data-driven design they're approaching some local maximum, trying to improve metrics that we probably all agree aren't the best for our mental health or wellbeing.
Yeah, the way I look at it is product managers and everyone above them in the reporting chain make more money for their respective companies the more they optimize short-form content delivery. Pretty much what you just said.
So, what we're left with is a hyper-optimized content pipeline over the years that's pretty rough to get away from when quite a large number of people are already accustomed and/or addicted. In other words it's really hard to close up Pandora's Box again, but fortunately not impossible.
>Nice chat, apologies if my response was off-putting. It was intended to be self-deprecating humor.
No worries, wasn't sure and didn't want to read into it wrong. Wasn't trying to be snarky on my end. Cheers.
It is not the norm here either.
[flagged]
I’d expect this to be down-voted too. It has nothing to do with the article and makes no concrete claims. It’s easy to slag anything in vague terms, but it adds nothing to this discussion.
>Any comment that challenges mainstream science ...
Stupid mainstream science.
>... and leftist politics ...
>... nor do they see that their views are political in nature.
You don't say. Personally, I respect comments that prove their own claims.
> It's an echo chamber here, too...
> Any comment that challenges [ whatever else you dislike ] and leftist politics gets downvoted into oblivion.
It's not. Downvoting here is much less of a conversation-breaking action than the shadow banning, outright banning, and trash-comment pollution on other social media sites. Downvoting has its place, and the occasional abuse of it should not be a problem. It hurts you so much because you want your views to prevail, but that's your personal problem, not a site problem.
I get downvoted quite often here too, though these days I usually challenge rightist politics - mostly because it is being used as an excuse for some wrong action or inaction at the moment. Most of the time, however, I have no idea whether what I'm objecting to is leftist or rightist; I'd have to think separately to figure that out, and I usually don't, because it's actually hard.
> most people who hold the worldview that is enforced here often cannot see their own presuppositions.
I suppose you think you can see your own presuppositions, and I'm ready to challenge that. I'm also ready to challenge the notion that only leftist worldviews are promoted here. The word "enforced" is quite misleading too; some tradition does exist on this site, but it's different and a lot more nuanced than central "leftist" enforcement.
> nor do they see that their views are political in nature.
So are yours... A lot of views are political in nature; some people realize it, some don't. That fact has no bearing on the validity of those views, and pointing it out helps nobody.
I don’t know if I buy the explanation that this was due to the feed algorithm. It looks like an artifact of being exposed to X’s current user base instead of their old followers. When Twitter switched to X there was a noticeable shift in the average political leanings of the platform toward alignment with Musk, as many left-leaning people abandoned the platform for Bluesky, Mastodon, and Threads.
So changing your feed to show popular posts on the platform instead of just your friends’ Tweets would be expected to shift someone’s intake toward the average of the platform.
I'm not sure what your point is. How is "being exposed to X's current user base instead of their old followers" not equivalent to "turning on the feed algorithm"? You doubt the effect is due to the algorithm, but your alternative explanation describes exactly what the algorithm does.
Is this the result of a feedback loop from musk joining or did they just accelerate the overall decline of the platform with him joining? Some might say it was going this way even before he picked it up, but it was certainly an inflection point when he joined either way.
All modern social media is pretty toxic to society, so I don't participate. Even HN/Reddit is borderline. Nothing is quite as good as the IRC and forum culture of the 2000s, where everyone was truly anonymous and almost nobody tied any of their worth to what exchanges they had online.
The moderation changes absolutely changed posting behavior. People got banned for even faintly gesturing the wrong direction on many issues and it frightened large accounts into toeing the line.
> Even HN/Reddit is borderline.
It's the proliferation of downvoting. It disincentivizes speaking your honest opinion and artificially boosts mass-appeal ragebait.
It's detrimental to having organic conversations.
"But the trolls" they say.
In practice it's widely abused.
Using HN as an example: there are legitimate, textbook opinions that will boost your comment to the top, and ones that will quickly sink to the bottom and often be flagged away for disagreement. Ignoring obvious spam, which is noise, there is no correlation with "right" or "wrong".
That's one advantage old-school discussion forums and imageboards have. Everyone there and all comments therein are equally shit. No voting with the tribe to reinforce your opinion.
What's worse is social media allowed the mentally ill to congregate and reinforce their own insane opinions with plenty of upvotes, which reinforces their delusions as a form of positive feedback. When we wonder aloud how things have become more radicalized in the last 20 years — that's why. Why blame the users when you built the tools?
I like voting (up and down) but I also agree with your take. Reddit salts the votes, but maybe the solution is to allocate a limited number of votes (up or down) that a user can use weekly. Make it so that when you vote, it's much more meaningful and truly reflects an opinion you either really agree with or really disagree with.
Ultimately, I think it comes back to people valuing their online personas way too much, and that's something we've intentionally marched towards.
>but maybe the solution is to allocate a certain amount of reasonable votes (up or down) total that a user can use weekly
Slashdot did this back in the day IIRC.
I was just gonna say that!
Yeah, you had to have sufficient rep on Slashdot, then you were randomly allocated a certain number of votes (5 or so, IIRC) that you could use to moderate. There were fixed categories you could vote an item as, such as "funny" or "off topic". Once your votes were gone, that was it until you were randomly awarded more. The max score anything could get was 5, and the minimum was -1. You could use the scores to filter what you saw (i.e., show full text of >3 insightful, summaries of 1-2, and hide <1).
It worked pretty well. Obvious trolling was still down voted, and insightful stuff was up voted. The ability to just show a blurb of lower-voted stuff was nice as well; you could ignore obvious crap, but expand it if it caught your attention.
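For the curious, here's a toy sketch of that kind of scheme in Python. It's purely illustrative and reconstructed from memory: the category names, point counts, and thresholds are my assumptions, not Slashdot's actual implementation.

    # Toy Slashdot-style moderation (all names/numbers are assumptions).
    from dataclasses import dataclass, field

    MIN_SCORE, MAX_SCORE = -1, 5      # hard caps on any comment's score
    CATEGORIES = {"insightful", "funny", "informative", "offtopic", "troll"}

    @dataclass
    class Comment:
        text: str
        score: int = 1                # baseline score for a logged-in poster
        labels: list = field(default_factory=list)

    @dataclass
    class Moderator:
        points: int = 5               # randomly granted; spent one per vote

        def moderate(self, comment, category, delta):
            """Spend one mod point to nudge a comment up (+1) or down (-1)."""
            if self.points <= 0:
                raise RuntimeError("out of mod points until granted more")
            if category not in CATEGORIES or delta not in (-1, 1):
                raise ValueError("invalid moderation")
            comment.score = max(MIN_SCORE, min(MAX_SCORE, comment.score + delta))
            comment.labels.append(category)
            self.points -= 1

    def visible(comments, threshold=3):
        """Reader-side filter: full text only for comments at/above threshold."""
        return [c for c in comments if c.score >= threshold]

Once the points are spent, a moderator simply can't vote again until the system hands out more, which is what kept drive-by mass voting in check.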
This was good insight. I do think this system would work better. I like the min rep aspect too. The things you've listed would go a long way to preserve the filtration effects of a voting system while possibly mitigating abuse and bot proliferation.
I don't know what changes have been made more recently, but I know there was a definite change to the Twitter algorithm a few months ago that filled the feeds of conservatives with posts from liberals and vice versa. It seemed to be specifically engineered to provoke conflict.
> When Twitter switched to X there was a noticeable shift in the average political leanings of the platform toward alignment with Musk, as many left-leaning people abandoned the platform for Bluesky, Mastodon, and Threads.
Do you have any numbers? In my experience it's still majorly communists.
Open the front page in a private tab. Most posts are far right conspiracy theories, ragebait or grifts.
Look at that: https://news.ycombinator.com/item?id=46504404
I deleted my account after many years when X recently made the Chronological Feed setting ephemeral, defaulting back to the Algorithmic Feed each time the page is refreshed.
No way I'm going to let that level of outrage-baiting garbage even so much as flash before my eyes.
I just click "following" at the top and never see anything I didn't ask to see. It resets once every few months to the other tab which I assume is just the cookie setting expiring.
Train it: I just have to spend 3 minutes every other year to tap the 3 dots on every post and choose "Not Interested", for an epic feed unmatched anywhere.
Train the algorithm so that you can be the sort of product you want to see in the world
Oddly enough, X is the only platform I've been able to teach not to show me culture war stuff, from either side. It just shows me AI in the "For You."
I've been pretty consistent about telling Bluesky I want to see less of anything political and also disciplined about not following anybody who talks about Trump or gender or how anybody else is causing their problems. I see very little trash.
Maybe it has gotten better recently. I tried and tried with Bluesky, but it would not abide.
It was bad the week Trump got elected, it’s gotten better since then.
The uncomfortable truth to most "the algorithm is biased" takes is that we humans are far more politically biased than the algorithms and we're probably 90% to blame.
I'm not saying there is no algorithmic bias, and I tend to agree the X algorithm has a slight conservative bias, but for the most part the owners of these sites care more about keeping your attention than about getting you to vote a certain way. Therefore, if you're naturally susceptible to culture war stuff, and this is what grabs your attention, it's likely the algorithm will feed it.
But this is a far broader problem. These are the types of people who might have watched politically biased cable news in the past, or read politically biased newspapers before that.
The issue brought up in the article isn't that "the algorithm is biased" but that "the algorithm causes bias". A feed could perfectly alternate between position A and position B and show no bias at all, yet still select more incendiary content on topic A and drive bias towards or away from it.
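To make that distinction concrete, here's a toy model in Python (all posts and scores invented): a feed that alternates stances exactly 50/50, so it has no stance bias at all, yet always serves the most incendiary post available on each side.

    # Toy model: zero stance bias, maximal incendiary selection.
    from itertools import cycle

    posts = [
        {"stance": "A", "heat": 0.1, "title": "calm take, A"},
        {"stance": "A", "heat": 0.9, "title": "outrage bait, A"},
        {"stance": "B", "heat": 0.2, "title": "calm take, B"},
        {"stance": "B", "heat": 0.8, "title": "outrage bait, B"},
    ]

    def build_feed(posts, length=4):
        # Bucket by stance, hottest first within each bucket.
        buckets = {
            s: sorted([p for p in posts if p["stance"] == s],
                      key=lambda p: p["heat"], reverse=True)
            for s in ("A", "B")
        }
        feed = []
        for stance in cycle("AB"):    # perfect A/B alternation: no stance bias
            if len(feed) >= length or not buckets[stance]:
                break
            feed.append(buckets[stance].pop(0))  # most incendiary remaining
        return feed

    # Top of the feed is all outrage bait despite a perfect 50/50 stance split.
    print([p["title"] for p in build_feed(posts)])

Measured by stance counts, this feed is perfectly "unbiased"; measured by what it amplifies, it's an outrage machine.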
I have the same thought, my X algo has become less political than HackerNews. I suppose it depends on how you use it but my feed is entirely technical blogs, memes, and city planning/construction content
And this is why the price for Twitter was, in the end, remarkably low.
Yeah, this was always the play, looking back in hindsight. Like, I didn't get it: "why would you pay that kind of money for a web forum?!" But it wasn't the forum that was important. Twitter (for better or worse) has wormed its way into the fabric of American discourse. He was basically buying the ideological thermostat for the country and turning the dial to the right.
This is even worse outside of NA. In many countries it is the de facto communication channel of government and businesses.
Or from the other perspective: Meta and Google have had their finger on the scale for more than a decade (along with old twitter).
In Twitter's case, you had regime officials directing censorship illegally through open emails and meetings.
It's no surprise that the needle moves right when you dial back the suppression of free expression even a little bit (X still censors plenty)
How is it illegal? It is their platform to do what they want with it. You can disagree and not use it, but it is theirs to do with as they see fit. If this was a government run operation paid for with tax dollars, then it would be an issue.
I think that, when you cross a certain level, you ought to be held responsible for the influence you exert on society. All political power needs to come from the popular vote, through fair elections.
As opposed to the government turning it to the reality-bending left? There was direct communication from Senators and members of Congress directing Twitter to block and ban based on certain topics. And Twitter obliged.
If you think Elon's hand on the scale isn't pushing an agenda too I have a bridge to sell you. This was always about power.
dark
Why anyone is still using X after 2025 is a mystery (I know, it's where everyone is, but the moral implications are wild)
Seriously. The CEO is openly posting white supremacist content like it's Stormfront. If you don't support that, you should get out.
I don’t know which country you’re in, but in the US Trump won the popular vote. Plenty of people here are perfectly happy with Stormfront.
I think the idea that if you don't support white supremacy you should get off the site owned and run by a clear white supremacist applies regardless of how elections go.
I recommend _Culture in Nazi Germany_ by Michael H Kater. [0] It is very dry but goes into detail of the culture of the era from late 1920s to end of WWII.
One aspect he highlights at the end is that Fascism was not rejected by Germany's citizens, current and former (including those who emigrated). In their minds it had merely been incorrectly implemented. A number of Zionists who emigrated from Germany to Palestine were supporters of Fascism. It was not until the mid-to-late 1960s that people started to realize and admit that Fascism was bad.
I personally will never fund Elon Musk. Anyone who says empathy is bad is a bad person at heart. Empathy is intelligence, and those who lack it lack strong intelligence. There is no way to put yourself in the position of what others have gone through without empathy.
[0] https://academic.oup.com/ahr/article-abstract/128/3/1512/728...
You keep making this claim as some kind of sick attempt at comparing Jews to Nazis. It’s simply not true.
No one was even talking about zionism here, you are just looking for places to push your hateful agenda.
https://www.pewresearch.org/short-reads/2025/06/03/most-peop...
Dude everybody knows what happened to your country. It's actually sad/horrific, not hateful.
I recommend _Culture in Nazi Germany_ by Michael H Kater [0] for the fact that it examines the complexity of reality during this time. He actually went and talked to the musicians, actors, and writers of the era to better understand the culture and viewpoints that people still held after the war. I will take his expertise over that of people in the modern era who have not engaged with those who lived through it.
People scoff at the idea of _Jews for Hitler_. The reality is that a number of German Jews actually supported Hitler and Fascism in the 1920s-1930s and later. That undercuts the notion that modern-day people would be able to pick out Fascism and reject it, a notion populism has shown to be wrong.
By the way, I consumed _Mein Kampf_ by Adolf Hitler. It was not to align with his ideology but to understand it, to have an independent reference for the "like Hitler" comparisons that politicians, the media, and pop culture use, and to gain a base understanding of Fascism. I in fact reject Fascism, Nazism, and Adolf Hitler.
An example: Hitler's takeover of Europe was under the guise that resources were theirs for the taking and were needed to support the country. He states this in his book. This is the same argument that Donald Trump and his executive branch use against Greenland. Yet history proved Hitler wrong. Technology is what drives an economy, not direct resource access. Japan, the USA, and South Korea proved this after WWII.
[0] https://www.cambridge.org/core/journals/central-european-his...
[1] https://en.wikipedia.org/wiki/Michael_Hans_Kater
P.S. Based on your post history, your account is a shell account with only two comments, both of which push pro-Israel ideology.
Less than you might think.
He didn't win a majority of the vote, just a plurality. And fewer than 2 of 3 eligible voters actually voted. So he got about 30% of the eligible population to vote for "yay grievance hate politics!" Which is way more than it should be, but a relatively small minority compared to the voter response after all ambiguity about the hate disappeared. This is why there's been a 20+ point swing in special election outcomes since Trump started implementing all the incompetent, corrupt, racist asshattery.
"If everyone had voted, Trump still would have won" (by an even wider margin)
https://www.npr.org/2025/06/26/nx-s1-5447450/trump-2024-elec...
A 2025 study... Asking people if they "would have" voted for the winner of the election, a corrupt vindictive racist asshat already in power? Well, I guess that's one way to conduct a study. Fortunately the shift in sentiment is clear, growing, and reflected in special elections.
Your theory is that people who didn't care enough to vote are concerned that Donald Trump is going to come after them if they don't say they would have voted for him, when surveyed anonymously?
And then NPR was duped into credulously reporting on this polling?
That is quite a theory.
I'm saying it doesn't take much for someone to say, "yeah, I would have voted for the guy already in power". I'm surprised it wasn't much higher than that.
So no, you definitely misrepresented my theory. It doesn't take a specific threat of violence for someone to say "sure, I would have cast a vote for the winner." And yet it was only ~1.5% higher than before the election. Are you saying you don't even recognize the bias of saying "yeah, I'm good with the winner"? Or the bias of a honeymoon period? I mean, June 2025 was before 90% of his craziest shit. But you go on.
Oh sorry, you made it sound like "corrupt" and "vindictive" were somehow relevant to the polling results.
The media seemed pretty surprised by the results, which indicates that your hypothesis is perhaps not accurate. But hey, keep doubling down, moving the goalposts, etc. I'll leave you to it.
Nah, just an observation. Or my hypothesis is accurate and they were just taking it at face value like you apparently did (assuming you are posting in good faith). The click bait appeal couldn't have hurt (although I agree with your expectation that they don't usually go for that). But dippy did pull their funding after all.
My goalposts never moved. Sorry you misinterpreted a few accurate adjectives.
Have a great evening!
I didn't get it either, until I trained the algorithm to feed me what I want by just clicking the three dots and selecting Not Interested on anything I never wanted to see again... it listens. What's left is really unmatched anywhere; I've really looked, and occasionally still do out of curiosity.
Lots of info is shared there first; it shows up in news articles and podcasts 12-24 hours later. Not everything shared there is true, of course, so one has to do due diligence. But it definitely surfaces content that wouldn't show up if I just read the top 2-3 news websites.
Live updates for sports events. People post highlights and replays before anyone else.
What does it mean to have someone on a chronological feed, versus the algorithmic one? Does that mean a chronological feed of the accounts they follow? I hardly ever use that, since I don't follow many people, and some people I follow post about lots of stuff I don't care about
from the study:
> We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks
I don't follow people who post about lots of stuff I don't care about. I follow hashtags instead, which gives me a much higher signal to noise ratio than following those people.
I thought hashtags were dead. Does this surface tweets that just have the word, not the hashtag? Can you follow multiple words using boolean operators?
Perhaps it depends on the platform. Hashtags work great on Mastodon.
Some Mastodon clients allow following multiple words using boolean operators although the amount of support may vary depending on which client you use:
e.g.: #hashtag1 AND #hashtag2
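As a rough sketch of what a client has to do under the hood (hypothetical code, not any real client's API), evaluating such an expression against a post's tags can be as simple as:

    # Hypothetical boolean hashtag filter; AND binds tighter than OR.
    def matches(post_tags, expr):
        """Check whether a post's tags satisfy e.g. '#hashtag1 AND #hashtag2'."""
        for disjunct in expr.split(" OR "):
            required = [t.strip().lstrip("#").lower()
                        for t in disjunct.split(" AND ")]
            if all(tag in post_tags for tag in required):
                return True
        return False

    print(matches({"hashtag1", "hashtag2"}, "#hashtag1 AND #hashtag2"))  # True
    print(matches({"hashtag1"}, "#hashtag1 AND #hashtag2"))              # False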
"We need more funding into open protocols that decentralize algorithmic ownership; open platforms that give users a choice of algorithm and platform provider; and algorithmic transparency across our information ecosystem."
This sounds like a call to separate the aggregation step from the content. Reasonable enough, but does it really address the root cause? Aren't we just as polarized in a world where there are dozens of aggregators into the same data and everyone picks the one that most indulges their specific predilections for engagement, rage, and clicks?
What does "open" really buy you in this space?
Don't get me wrong, I want this figured out too, and maybe this is a helpful first step on the way to other things, but I'm not quite seeing how it plays out.
I’d hope people wouldn’t intentionally pick the political extremism feed if they had any other option (although it’s hard to say).
I will note that political extremists can have more interesting content at times, and it’s good to see what they are up to in case it will affect you. They also sometimes surface legitimate stories that are kept out of the mainstream press, or which are heavily editorialized or minimized there. But you definitely have to view all sides to gain an accurate view here, it would be a mistake to read only one group of extremists. And it’s almost always a mistake to engage with any of them.
From where I'm sitting, it seems obvious people do exactly that.
"It's the most interesting one!"
For a related example: I was talking with a colleague recently about how we had both (independently) purchased Nebula subscriptions in an effort to avoid getting YouTube Premium and giving Google more money, but both felt the pull back to YouTube because it is so good at leveraging years of subscription and watch history to populate the landing page with content we find engaging.
If even two relatively thoughtful individuals choosing to spend money on a platform with the kind of content they'd like to watch can't seem to beat an engagement-first algorithm, I'm not sure how much hope normies have. Unless the real issue is just being terminally online, period, and the only way to win is simply not to play.
Underrated in X's changes is how blue checkmark users are shown first underneath popular tweets. Most people who pay for blue checkmarks are either sympathetic to Musk's ideology or indifferent. Many blue checkmark users are there to make money from engagement.
The result is that underneath any tweet that gets traction you will see countless blue checkmark users either trolling for their side or engagement-baiting.
The people who are more ideologically neutral or not aligned with Musk are completely drowned out below the hundreds of bulk replies of blue checkmarks.
It used to be that if you saw someone, like a tech CEO, take an interesting position you'd have a varied and interesting discussion in the replies. The algorithm would show you replies in particular from people you follow, and often you'd see some productive exchange that actually mattered. Now it's like entirely drivel and you have to scroll through rage bait and engagement slop before getting to the crumbs of meaningful exchange.
It has had a chilling effect on productive intellectual conversation while also accelerating the polarization of the platform by scaring away many people who care about measured conversation.
I automatically tune out any blue checkmark post or reply and just assume it's an LLM responding to earn $.003
It is interesting to see a general bias as the takeaway from the study, which I wouldn't necessarily have guessed given my own experience. My X "For You" feed mostly does not read pro-Trump - instead it mostly pushes very intense pro-European and pro-Canadian economic and political separation from the USA, along with very negative narratives about the USA. I suppose it occasionally also introduces pro-Trump posts, and perhaps those do not sway me in the same way given that I am a progressive American.
That said, the Trending tab does tend to push very heavy MAGA-aligned narrative, in a way that to me just seems comical, but I suppose there must be people that genuinely take it at face value, and maybe that does push people.
Less to do with the article:
The more I think about it, the less sure I am why I even use X these days, other than the fact that I don't really have much of an in-person social life outside of work. Sometimes it can be enjoyable, but honestly my main takeaway is that microblogging as a format is genuinely terrible, and X in particular does seem to feed you the angriest things possible. Maybe it's exciting to try to discuss opinions, but it is simultaneously hardly possible to have a nuanced or careful discussion when you have limited characters and someone on the other end who just wants to shout over you.
I miss being a kid and going on forums for Scratch or Minecraft or whatever. The internet felt way more fun when it was just making cool things and chatting with people about it. I think the USA sort of felt more that way too, but it's hard to know if that was just my privilege. When I write about X, it uncomfortably parallels how my interactions with my family and friends in real life have evolved.
There was a great study from a decade ago showing that baseball cards held by lighter-skinned hands outsold the same cards held by darker-skinned hands on eBay.
An algorithm designed today with the goal of helping users pick the most profitable product photo would probably steer people towards using Caucasian models, and because eBay's cut is a percentage, eBay would be incentivized to use it.
Studies show that conservatives tend to respond more positively to sponsored content. If this is true, algorithm-driven ad-sponsored social sites will tend towards conservative content.
https://onlinelibrary.wiley.com/doi/abs/10.1111/1756-2171.12...
https://www.tandfonline.com/doi/full/10.1080/00913367.2024.2...
It couldn’t be possible for a social media feed to influence users in the direction of issues important to the Democratic Party, could it?
Or would that just be considered an unalloyed good?
I honestly don't understand how or why people are using Twitter to keep up with the news. The only thing I use it for is to follow artists, and even that has been going down in recent weeks with most of my favorites moving over to Bluesky. Maybe I'm just a long-winded idiot, but the character limits barely let me have a conversation on either platform. How are people consuming news like this?
It just baffles me how different my experience of using the platform is. I literally do not see any news. I'm not entirely convinced that it's Twitter being biased and not just giving each person what they most engage with.
You follow artists, and they are not tweeting their political opinions? Cool.
I gave up on Twitter when everyone I followed kept adding politics. Even if I agreed with it, I just don't want to marinate in the anger all day.
The one exception I can think of is the guy from Technology Connections, whom I stopped following because I got tired of seeing him in my feed complaining about something or other. And I've noticed he's been putting that into his videos as well, so I might have to do the same on YouTube.
My feed (UK based) seems to give me the major news stories well before the mainstream (BBC), and I'm talking days if not weeks in some cases. Now, it could be that's how the mainstream decides to cover a particular story. What's worrying is when a story is all over X but isn't covered at all.
To give an example, the recent protests in Iran were being covered on X while the BBC was silent for weeks before finally covering the story (for a few days).
Could it also be that "mainstream" news outlets are actually trying to verify information and/or obtain confirmation from other sources? All of that is done in an attempt to avoid promoting false information. People tweeting do none of that.
Could be. Of course, the UK press can also be subject to “D-Notices” that prevent publication of certain sensitive stories. In cases where a story may be subject to a D-Notice, your only hope for information is to go to alternative or foreign media.
It used to be self-expression in an oddly entertaining way, but then Nikita Bier ruined the whole thing with his metrics-chasing algorithmic shifts.
>I honestly don't understand how or why people are using Twitter to keep up with the news.
Because the MSM news stations themselves pick up the stuff from Twitter and just add their own spin. A dozen phone videos from random citizens on-site will always be quicker than the time it takes CNN/FOX to send a reporter there. On Twitter you at least get the raw footage and can judge for yourself before the MSM try to turn it political to ragebait you.
There's much more diversity of thought on the right. Did they get more open-minded?
You have to find good people. Bad people will find you.
What a garbage short and shallow take. They are all like that, and it even started with news networks (90s CNN in particular). Bluesky, Reddit, TikTok, and even YouTube all have horrible userbases and algorithmic tendencies towards doomscrolling. Lots of bot accounts for the past 10 years, and now hyper-powered by AI content.
You can tune some of them by heavily curating what you follow and blocking a lot. But it's an uphill battle.
The only winning move is not to play. How about a nice game of chess?
I really wish these points were made in a non-political, non-platform-specific way, because if you care about this issue it's ultimately unhelpful to frame it as an issue with just X or conservatives, given how politically divided people are.
I do share the author's concerns, and I was also concerned back in the day when Twitter was quite literally banning people for posting the wrong opinions there. But it's interesting how the people who used to complain about political bias now seem not to care, while the people who argued "Twitter is a private company, they can do what they want" suddenly think the conservative-leaning algorithm on X is a problem. It's hard to get people across political lines to agree when we do this.
In my opinion there are two issues here; neither is politically partisan.
The first is that we humans are flawed, and algorithms can use our flaws against us. I've repeatedly spoken about how much I love YouTube's algorithm because, despite some people saying it's an echo chamber, I think it's one of the few recommendation algorithms that will serve you a genuinely diverse range of content. But I suspect that's because I genuinely like consuming a very wide range of political content, and I know I'm in a minority there (probably because I'm interested in politics as a meta subject but don't have strong political opinions myself). My point is that these algorithms can work really well if you genuinely want to watch a diverse range of political content.
Secondly, some recommendation algorithms (and search algorithms) seem to be genuinely biased, which I'd argue isn't a problem in itself (they are private companies and can do what they want), but that bias isn't transparent. X very clearly has a conservative bias, and Bluesky also very clearly has a political bias. Neither would admit it, so people incorrectly assume they're being served something fairly representative of public opinion rather than curated – either by moderation or by algorithm tweaks.
What we need is honesty: from individuals, who should seek out their own biases, and from platforms, which pretend not to have bias but do, and therefore influence where people believe the center ground is.
We can all be more honest with ourselves. If you exclusively use X or Bluesky, it's worth asking why that is, especially if you're engaging with political content on these platforms. But secondly, I think we need more regulation around the transparency of algorithms. I don't necessarily think it's a problem if some platform recommends certain content above other content, or has some algorithm to ban users who post content it doesn't like, but these decisions should be far more transparent than they are today, so people are at least able to factor them into how they perceive the neutrality of the content they're consuming.
I blame Google for a lot of this. Why? Because they, more than anyone else, succeeded in spreading the propaganda that "the algorithm" was some unbiased, even all-knowing black box with no human influence whatsoever. They did this for obvious self-serving reasons, to defend how Google properties ranked in search results.
But now people seem to think newsfeeds, which increase the influence of "the algorithm", are just a result of engagement and (IMHO) nothing could be further from the truth.
Factually accurate and provable statements get labelled "misinformation" (either by human intervention or by other AI systems ostensibly created to fight misinformation) and thus get lower distribution. All while conspiracy theories get broad distribution.
Even ignoring "misinformation", certain platforms will label some content as "political" and other content as not when a "political" label often comes down to whether or not you agree with it.
One of the most laughable incidents of putting a thumb on the scale was when Grok started complaining about white genocide in South Africa in completely unrelated posts [1].
I predict a coming showdown over Section 230 about all this. Briefly, S230 establishes a distinction between being a publisher (e.g., a newspaper) and a platform (e.g., Twitter) and gives platforms broad immunity from liability for user-generated content. This was, at the time (the 1990s), a good thing.
But now we have a third option: social media platforms have become de facto publishers while pretending to be platforms. How? Ranking algorithms, recommendations and newsfeeds.
Think about it this way: imagine you had a million people in an auditorium and you were taking audience questions. What if you only selected questions that were supportive of the government or a particular policy? Are you really a platform? Or are you selecting user questions to pretend something has broad consensus or to push a message compatible with the views of the "platform's" owner?
My stance is that if you, as a platform, actively suppress and promote content based on politics (as IMHO they all do), you are a publisher, not a platform in the Section 230 sense.
[1]: https://www.theguardian.com/technology/2025/may/14/elon-musk...
Oh my. Now that X is affecting people's politics (for the better IMO), suddenly people care about the influence of algorithms over politics...
I am shocked, shocked to find that there is social engineering in this establishment!
Yep, but you won't find common sense on the matter here, unfortunately.
This person is confused. Trump was a well known pussy grabber for decades. Epstein was anything but a secret, it seems, given how many politicians and celebrities and moguls he rubbed elbows with. Jerry stopping by the island for a lemonade and a spot of lunch with his high school aged girls? Yeah.
It comes down to this: you can have visibility into things and yet those in power won’t care whatsoever what you may think. That has always been the case, it is the case now, and will continue to be in the future.
This is a defeatist attitude. Don't know what bubble you're in but these official revelations are driving real change in mine. It's kind of subtle at this point but it's the kind of change that cannot be undone.
Unfortunately speed often matters when it comes to outcomes. Eg if you get a cancer diagnosis like Jobs, you probably shouldn’t waste a year drinking juices and doing acupuncture.