This is a rush transcript. Copy may not be in its final form.
AMY GOODMAN: This is Democracy Now!, democracynow.org. I’m Amy Goodman.
We end today’s show with the acclaimed writer and tech activist Cory Doctorow. Three years ago, he coined a term to describe how online platforms, like Facebook, degrade over time as platform owners seek to maximize profit. The phrase Doctorow came up with went viral, but due to FCC rules, we can’t say it on the air. We’ll just call it “en[bleep]ification.” The “bleep” is covering a word that rhymes with “hit.” It’s also the title of Cory Doctorow’s new book.
Cory, welcome to Democracy Now! The book title, En[bleep]ification: Why Everything Suddenly Got Worse and What to Do About It. Why don’t you start from the beginning? What do you mean by this term?
CORY DOCTOROW: So, it’s a way to talk about how the platforms go bad, but it’s also a way to talk about why the platforms go bad. So, we can see this characteristic pattern of decay in platforms, where first they’re good to their end users, they find a way to lock those users in, and once it’s hard for the users to leave, they can make things worse for those end users to tempt in business customers. Those business customers also get locked to the platform, and so then things are made bad for them, too, because they can’t readily depart. And eventually, you end up with a platform where all the value has been harvested by shareholders and executives, and there’s just kind of a mingy, homeopathic residue of value left behind for the platform users. And yet, because of these broader social, political and economic factors, it’s hard for us to leave these platforms, and they sort of shamble on.
AMY GOODMAN: So, tell us how it works. Take one example, whether —
CORY DOCTOROW: Sure.
AMY GOODMAN: — you’re talking about Google, whether you’re talking about Twitter, whether you’re talking about Facebook. And talk about how they started and where they’re headed.
CORY DOCTOROW: I think Facebook’s the canonical case, right? So, Mark Zuckerberg, 2006, wants to open up beyond American college kids. You don’t need a .edu address anymore. So, he says, “Come to Facebook. We’re the platform that, unlike MySpace, will never spy on you, and we’re only going to show you the things that you ask to see. So, if you subscribe to some people, that’s what’s going to be in your feed.” And so, people pile in, and they lock themselves in through something called the collective action problem, which is just a way of saying, you know, you love your friends, but they’re a pain in the butt, and even though you all want to hang out and play a board game this weekend, you can’t agree on what board game to play. Much less, even though you all agree that you hate Facebook, you can’t agree on when it’s time to leave or where to go or how to reestablish yourself, so you get stuck there.
And once you’re stuck there, Mark Zuckerberg starts to make things worse for you, to make things better for business customers. So, they go to the advertisers, and they say, “You remember we told these rubes we weren’t going to spy on them? Total lie. We spy on them with every hour that God sends. Give us small dollars. We will target ads to those people with just incredible fidelity, and we’ll spend as much money as it takes to stop ad fraud. So, you give us a dollar to show an ad to a user. That user is going to see that ad.” And publishers get a similar deal. You know, “We told these users we wouldn’t stick stuff in their feed they didn’t ask to see, but we’ll make an exception for you. Put stuff on Facebook from your website, little excerpts, link back to your own website. We’ll just cram it into their eyeballs, even if people never asked to see it. Some of them will click the link. You’ll get to monetize that traffic.” And so, they become locked in, too.
They become dependent on those users, who are dependent on each other, and now ad prices go up, ad targeting fidelity goes down, ad fraud explodes. Procter & Gamble used to spend $200 million a year on surveillance ads. In 2017, they took that to zero and saw a $0 drop in sales, because all those ads were just disappearing down the fraud hole. Meanwhile, publishers are finding they have to put more and more of their content onto Facebook just to be shown to their own subscribers. So they end up substituting Facebook for their own websites. And even, you know, if you put a link at the bottom of your post, they’re not even going to show that to anyone, because maybe the link’s a malicious link.
So, now you have all the value harvested for Facebook. And they’re in this very brittle equilibrium, because the difference between “I hate this place, but I can’t stop coming to it” and “I hate this place, I’m not coming back” is razor-thin. You take one scandal, one live-streamed mass shooting, people bolt for the exits, and then Facebook panics. And being tech bros, they call it “pivoting.” And so, one day, you know, Mark Zuckerberg arises from his sarcophagus, and he says, you know, “Hearken to me, brothers and sisters, for I’ve had a vision. I know I told you that the future would consist of arguing with your racist uncle using this primitive text interface I made in my dorm room to nonconsensually rate the bangability of my fellow undergraduates, but, actually, I’m going to transform you and everyone you love into a legless, sexless, low-polygon, heavily surveilled cartoon character, so I can imprison you in a virtual world I stole from a 25-year-old satirical cyberpunk novel that I call the metaverse,” right? And that is the final stage of en[bleep]ification. The platform is a giant pile of bleep.
AMY GOODMAN: So, let’s go to what you wrote about President Trump and the platform owners. In the United States, you write, at the 2025 inauguration, Trump spoke from within a “decorative semicircle of tech billionaires”: Meta CEO Mark Zuckerberg, former Amazon CEO Jeff Bezos, Google CEO Sundar Pichai, Apple CEO Tim Cook, TikTok CEO Shou Zi Chew and, of course, Elon Musk. “These men intervened in many ways on Trump’s behalf.” So, if you can talk about President Trump, his circle of platform owners. You talk about Donald Trump’s election representing the ultimate triumph of en[bleep]ification of the political realm.
CORY DOCTOROW: So, you know, this is a theory about not just why the platforms are bad now, but you have to interrogate why they were better before. What stopped them from going bad? Because we didn’t invent greed in like 2017. So, what was it that stopped them from going bad?
My theory is they had constraints. They had to worry about competitors. But then we let them buy all their competitors. They had to worry about regulators. But when you boil an industry down to, like, five giant websites filled with screenshots of text from the other four, they have so much profit from not competing, and they find it so easy to agree, that they capture their regulators.
They also had to worry about their workers, because tech workers used to be so powerful. Even though they weren’t unionized, they were in enormous demand, super productive. There’s a National Bureau of Economic Research paper that estimates that the average Silicon Valley worker was contributing a million dollars a year to the bottom line of their employers. So their employers really valued them. They couldn’t afford to lose them. They couldn’t replace them.
And you take all that away, and you get an environment in which people can do bad things with impunity. They don’t face any consequences.
And Trump, that was his promise: “I’ll neuter the National Labor Relations Board. I’ll get rid of antitrust, unless you make me angry, in which case I’ll do antitrust until you use the $TRUMP coin tip jar on the Resolute Desk to give me some money. And I’m going to let you buy as many of your competitors as you want, and not have to worry about any of these external sources of discipline.” And, you know, even the best of us, without discipline, go horribly wrong. You know, this is the problem of people who can just shout at other people. We all have writers we love who eventually get so big they can tell their editors to go to hell, and then their books get really terrible. But, you know, Howard Hughes, if you gave him some constraints, would make you some pretty cool airplanes. Take away the constraints, that guy starts wearing Kleenex boxes on his feet and saving his urine in jars, right?
So, what these platforms aspire to is their own demise, right? It’s what Trump aspires to, to be in a world in which no one can tell him no and in which every bad idea that kind of materializes in the vacant spaces in his head becomes a policy. And so, you know, en[bleep]ification is the collapse of discipline. And America’s ruling class has managed to neutralize all the discipline that it ever faced. Their weirdest, worst ideas are the ones that we’re all stuck with.
AMY GOODMAN: What do you think of the Trump deal that would put his billionaire ally, Larry Ellison, who became the richest man in the world for a few minutes a few weeks ago, in charge of TikTok?
CORY DOCTOROW: I think it shows you that the origin of this phenomenon is not users making bad consumption decisions. It’s not executives being greedy. It’s a policy environment that’s, like, en[bleep]ogenic, right? That when you have an environment that says you can do bad things and you can get away with it, then it doesn’t really matter who’s running the companies, right? Like, we can say, “Oh, Larry Ellison might be worse than the guy no one’s ever heard of who runs TikTok,” but is he really going to be worse? I mean, TikTok put a thumb on the scale for right-wing candidates in the last election cycle, and they were able to do that because they operate in total opacity.
We have laws, like the Digital Millennium Copyright Act. This is a Bill Clinton law from 1998. Section 1201 of that law makes it a felony — like, you’ll go to jail for five years and pay a $500,000 fine — to reverse engineer stuff in order to modify it. So if you wanted to put a different algorithm in your TikTok client, or just stop it from stealing your data while you use it, that’s a crime, right? And so, rather than replacing one guy who fills a tech supervillain-shaped hole in TikTok’s C-suite with another guy who fills the same hole, we could have said we’re going to empower people, co-ops, nonprofits, startups, and even big commercial rivals, to reverse engineer the TikTok app, to change how the algorithm works for you, so that you see the things that you want to see, so that it becomes a platform that’s responsive to you.
And when we made it legal to block ads on the web — it’s not legal on apps, because you’d have to reverse engineer them — we instilled discipline in companies, who have to worry that if they make the ads too obnoxious on their website, you’ll go install an ad blocker. Fifty-one percent of web users have installed an ad blocker. It’s the biggest consumer boycott in human history. And so, they have to think twice before they make it worse.
When you take away that discipline, when you say, “OK, well, you can do whatever you want,” and no one can defend themselves, you have the full panoply of all the things you can do with digital tools, where you can change the rules from moment to moment. You can change the recommendation system. You can, at your whim, alter the game. I call it the Darth Vader MBA: “I’m altering the deal. Pray I don’t alter it further.” And we’re defenseless in front of it. You get the worst of all technological worlds, a world where technology torments you endlessly and never saves you from that torment.
AMY GOODMAN: Talk about Google, how it started and where it is now, and what it’s taking from — I mean, it’s not only hurting consumers, it’s not only hurting people who are trying to communicate, but businesses, as well.
CORY DOCTOROW: Yeah, sure. So, you know, Google is a company that has had one really good idea. It was in the last millennium. They made a really good search engine. Virtually everything they’ve made in house since has crashed and burned. Almost everything they make as a company, they bought from someone else, in violation of antitrust law, in mergers that were waved through under both Republicans and Democrats. And they pretend they’re, you know, Willy Wonka’s idea factory. They’re really just Rich Uncle Pennybags buying other kids’ toys, right? And they eventually became not just too big to fail; they became too big to care.
In 2019, as we found out in the DOJ’s lawsuit against Google last year, having acquired a 90% search market share, they stalled out on growth. Obviously, right? How are you going to grow your search when you’ve got a 90% market share? We are already all using it, and we’re already using it for everything. I mean, yes, they could breed a billion humans to maturity and make them Google customers. They called that product Google Classroom. But it takes a minute to mature. And meanwhile, they want growth now, because investors want growth now.
So, in the memos that the DOJ published, we see this epic battle play out for the soul of Google. You have these two characters. One’s this guy Prabhakar Raghavan. He’s this ex-McKinsey guy, came from Yahoo. He’s in charge of search revenue. And he has this idea: What if we make the search worse? What if we turn off all the stuff that tries to guess the best match for your query, so that when you search, you don’t get the best match? You have to search again and maybe again and maybe again. Each time, we get to show you more ads, right? That’s growth.
Now, he’s opposed by this guy called Ben Gomes, who’s, you know, kind of the epitome of an OG googler. He is the guy who built out their server infrastructure. He started with one server under a desk at Stanford, built the global network of data centers, and now he’s in charge of search technology. And in the memos, you can see Gomes is palpably horrified by this idea to make this thing that he’s worked so hard for worse, just for this venal purpose.
And then they have this fight. And the fight plays out in the memos. And basically, Prabhakar Raghavan and his allies’ argument is, “I don’t really care if this makes you feel icky, because it’s going to make us more money. And we will face no consequences.”
Now, it’s not just that Google was shorn of the consequences of making their product worse. They also used to have to really value their workers. But because the workers thought of themselves as temporarily embarrassed entrepreneurs and thought, “Well, we’re being treated really well now. Why do we need a union?” as soon as supply caught up with demand, and you got half a million tech layoffs in the last three years, the workers lost the power to discipline them, too.
AMY GOODMAN: So, in this last minute, though we’re going to do a Part 2 —
CORY DOCTOROW: Sure.
AMY GOODMAN: — is there a breaking point beyond which even locked-in users rebel against these tech platforms?
CORY DOCTOROW: Yeah.
AMY GOODMAN: Where do you see the fightback?
CORY DOCTOROW: So, there is a breaking point. I don’t think we want to reach it, right? There’s a breaking point where people who live in the wildland-urban interface are like, “Oh, my house keeps burning down every year. I guess I’m going to leave now that everything I own has been destroyed.” Ideally, you’d want those people to leave beforehand.
So, people are on these platforms because the platforms do good things for them. People are on Facebook because it’s where the people with the same rare disease as them meet, or it’s where the people in the country they emigrated away from meet, or it’s where their affinity group meets. And we don’t want those things to be destroyed by a forced departure. We want to evacuate the platforms. And I think that in the European Union there’s this effort underway to create, like, portability, so if you leave Facebook or Twitter, you might be able to go to another platform — maybe that’s Bluesky, maybe it’s Mastodon, maybe it’s something that hasn’t been invented yet — and still stay in touch with the people you left behind, so you don’t have to choose between the people you love and the platform you hate.
AMY GOODMAN: So, we’re going to talk about Bluesky, Mastodon and much more in Part 2. But right now we have to wrap up the show. Cory Doctorow is a science-fiction author, activist, journalist. He works for the Electronic Frontier Foundation. His new book is, OK, well, we’ll say, En[bleep]ification: Why Everything Suddenly Got Worse and What to Do About It.
That does it for our show. I’ll be on the road. On October 17th, I’ll be in Santa Fe at the Santa Fe Film Festival, where the new film, Steal This Story, Please!, is going to be playing, and I’ll be doing a Q&A after. On October 18th, I’ll be in Woodstock; 19th, Saugerties. I’m Amy Goodman. Check our website, democracynow.org.
The original content of this program is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Please attribute legal copies of this work to democracynow.org. Some of the work(s) that this program incorporates, however, may be separately licensed. For further information or additional permissions, contact us.