Big tech has built machines optimized for one thing: keeping people scrolling. The algorithms don’t care what keeps you scrolling. It could be puppy videos or conspiracy theories about election fraud. They only care that you keep consuming. And it turns out, nothing keeps people engaged quite like rage.
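To make the mechanism concrete, here is a minimal sketch of what an engagement-optimized ranker boils down to. Everything in it is hypothetical — the field names, the weights, the predictions; real systems use learned models, but the shape of the objective is the same. Notice what's missing: nothing in the score asks what the content actually says.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_watch_seconds: float   # hypothetical model outputs
    predicted_angry_reactions: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    """Score a post purely by predicted attention.

    The weights are invented for illustration. The point is structural:
    truth, harm, and substance never enter the calculation.
    """
    return (
        1.0 * post.predicted_watch_seconds
        + 2.0 * post.predicted_angry_reactions  # outrage drives replies, so it scores high
        + 1.5 * post.predicted_shares
    )

def rank_feed(candidates: list[Post]) -> list[Post]:
    # A puppy video and an election conspiracy compete on the same axis:
    # whichever is predicted to hold attention longer wins the slot.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Once the objective is written down this way, "the algorithm doesn't care" stops being a metaphor. It's a property of the scoring function.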
The executives at these companies will tell you they’re neutral platforms, that they don’t choose what content gets seen. This is a lie. Every algorithmic recommendation is an editorial decision. When YouTube’s algorithm suggests increasingly extreme political content to keep someone watching, that’s editorial. When Facebook’s algorithm amplifies posts that generate angry reactions, that’s editorial. When Twitter’s trending algorithms surface conspiracy theories, that’s editorial.
They are publishers. They have always been publishers. They just don’t want the responsibility that comes with being publishers.
For years, these companies have hidden behind the liability protections of Section 230 of the Communications Decency Act while operating more like media companies than neutral platforms. They've used recommendation algorithms to actively shape what billions of people see every day, then claimed they bear no responsibility for the consequences. It's like a newspaper publisher disclaiming responsibility for the front page because they didn't write the articles themselves.
We need to be honest about what these algorithms are doing to our democracy. They're not just amplifying existing divisions; they're creating new ones. They're not just reflecting polarization; they're manufacturing it. Every time someone opens one of these apps, they're shown content specifically chosen to provoke an emotional response. That's not neutral. That's manipulation.
This isn't a technology problem. It's a business problem, rooted in choices. These companies could change their algorithms tomorrow to prioritize accuracy over engagement, community over conflict, human wellbeing over profit. They choose not to, because extremism is more profitable than moderation.
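That claim is easy to demonstrate. A sketch under the same assumptions as above, with hypothetical quality signals standing in for data platforms already collect (fact-check labels, user reports): changing what the system rewards is a change to one function, not a research program.

```python
def wellbeing_score(
    predicted_watch_seconds: float,
    predicted_angry_reactions: float,
    accuracy: float,        # hypothetical signal, e.g. a fact-check score in [0, 1]
    reported_harm: float,   # hypothetical signal from user reports, in [0, 1]
) -> float:
    """Alternative objective for the same ranking pipeline (weights invented).

    Engagement still counts, but it no longer dominates, and outrage
    becomes a cost instead of a prize.
    """
    return (
        0.5 * predicted_watch_seconds
        + 3.0 * accuracy
        - 4.0 * reported_harm
        - 2.0 * predicted_angry_reactions
    )
```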
The solution isn’t to ask nicely for these companies to do better. We tried that. The solution isn’t to hope users will abandon these platforms en masse. That won’t happen as long as the network effects keep people trapped.
The solution is regulation. Real regulation. Not the performative theater we’ve seen in congressional hearings, but actual laws with actual consequences.
We need algorithmic transparency. These companies should be required to disclose how their recommendation systems work and what content they’re amplifying.
We need algorithmic accountability. When an algorithm recommends content that leads to violence, there should be consequences. And we need algorithmic choice. Users should have the right to see chronological feeds, not just algorithmically curated ones designed to manipulate their emotions.
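Of the three demands, algorithmic choice is the easiest to picture. A minimal sketch with a hypothetical feed function: the chronological path never touches the engagement model at all.

```python
from datetime import datetime
from typing import Callable, Optional

def build_feed(
    posts: list[dict],
    mode: str = "chronological",
    engagement_score: Optional[Callable[[dict], float]] = None,
) -> list[dict]:
    """Hypothetical user-facing toggle between feed modes.

    'chronological' sorts by timestamp alone; 'curated' defers to
    whatever engagement model the platform runs today.
    """
    if mode == "chronological":
        return sorted(posts, key=lambda p: p["created_at"], reverse=True)
    if mode == "curated" and engagement_score is not None:
        return sorted(posts, key=engagement_score, reverse=True)
    raise ValueError(f"unsupported feed mode: {mode!r}")

# Example with invented records: the user, not the model, picks the ordering.
posts = [
    {"created_at": datetime(2024, 1, 2, 9, 0), "text": "newer, unremarkable post"},
    {"created_at": datetime(2024, 1, 1, 9, 0), "text": "older, rage-optimized post"},
]
print([p["text"] for p in build_feed(posts, mode="chronological")])
# ['newer, unremarkable post', 'older, rage-optimized post']
```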
Most importantly, we need to end the liability shield these companies hide behind. If you’re going to operate as a publisher, making editorial decisions about what content gets amplified, then you should face the same legal responsibilities as any other publisher.
Turn off the internet. Or fix it. Those are the only choices we have left. The time for hoping these companies will self-regulate is over. The time for treating algorithmic manipulation as an inevitable part of modern life is over. We know what these systems do. We know who they hurt. The only question left is whether we’re going to do something about it.