The New Yorker Radio Hour

It’s Not Just You: The Internet Is Actually Getting Worse

22 min
Oct 28, 2025
Summary

Cory Doctorow discusses 'enshittification', a three-stage process in which digital platforms first attract users, then shift value toward business customers, and finally degrade for everyone to maximize shareholder value. The episode explores how Google deliberately worsened search results, how antitrust action around the world is pushing back against platform monopolies, and why privacy regulation is key to breaking exploitative business models.

Insights
  • Platform degradation is a deliberate choice, not inevitable: Google's internal memos reveal intentional decisions to worsen search to show more ads, proving bad products result from business strategy rather than technical limitations.
  • Network effects and legal barriers create lock-in: IP law expansion (DMCA, patents, trademarks) prevents users from switching platforms the way they once migrated from MySpace to Facebook, entrenching monopoly power.
  • Global antitrust momentum is real and cross-partisan: Regulation is advancing in EU, UK, Canada, Australia, South Korea, Japan, and China—not just under Biden—suggesting genuine political will to constrain platform power.
  • Interoperability and privacy law are more effective than breaking up platforms: EU's Digital Markets Act and privacy regulations force better user experiences without requiring corporate dissolution.
  • AI is not yet enshittified because it was never good: Unlike platforms that degrade over time, generative AI is a flashy demo being oversold; the real risk is opaque cost structures and automated query routing that hide true pricing.
Trends
  • Global regulatory convergence on platform regulation: EU, UK, Canada, Australia, and Asia-Pacific nations are independently implementing strict interoperability and privacy rules that force American tech companies to comply globally.
  • Antitrust enforcement becoming bipartisan and cross-national: Momentum against tech monopolies transcends political parties and national borders, suggesting structural concerns about concentrated power rather than partisan politics.
  • Privacy-first regulation as the lever to break surveillance capitalism: Multiple jurisdictions are moving toward comprehensive privacy laws (unlike the US, which hasn't updated its privacy law since 1988) as the most effective way to disrupt exploitative platform models.
  • Interoperability as alternative to breakup: Regulators favor forced API access and compatibility standards over corporate dissolution, allowing platforms to exist but preventing lock-in.
  • Search alternatives emerging as Google degrades: Users are willing to pay for better search (Kagi, Perplexity) when incumbents deliberately worsen results, proving demand for quality exists even in concentrated markets.
  • AI cost opacity as emerging enshittification vector: Automated query routing and hidden pricing structures in AI services may replicate platform degradation patterns before users realize the true costs.
  • Contextual advertising as viable alternative to surveillance: Evidence that advertising-supported models don't require invasive data collection if regulation prevents it, challenging assumptions about ad-tech necessity.
Topics
  • Enshittification: three-stage platform degradation model
  • Google Search antitrust case and deliberate product degradation
  • Digital Millennium Copyright Act (DMCA) Section 1201 as a barrier to competition
  • EU Digital Markets Act and interoperability requirements
  • Platform lock-in and collective action problems
  • Privacy regulation as an antitrust alternative
  • Generative AI pricing and cost opacity
  • Search engine alternatives (Kagi, Perplexity)
  • Apple USB-C mandate and regulatory compliance
  • Surveillance advertising vs. contextual advertising models
  • MySpace-to-Facebook migration and data portability
  • Antitrust enforcement across the EU, UK, Canada, Australia, and Asia
  • Network effects and switching costs in tech platforms
  • E-waste and proprietary hardware standards
  • AI worker replacement claims and limitations
Companies
Google
Central case study: internal memos revealed a deliberate strategy to worsen search results to increase ad impressions.
Facebook
Example of platform enshittification: initially attracted users with privacy promises, then degraded the experience after achieving lock-in.
Apple
Received roughly $20B annually from Google to keep Google the default search engine; threatened to leave the EU over Digital Markets Act interoperability requirements.
OpenAI
Generative search alternative benefiting from Google Search degradation; uses opaque query routing and hidden pricing.
Uber
Example of platform lock-in strategy: eliminated competing cab companies to create user dependence.
MySpace
Historical example of a platform that lost dominance to Facebook when users could port their data via scrapers, before IP law expansion made such tools illegal.
Kagi
Paid search alternative ($10/month) gaining users by delivering Google-quality results from a reorganized Google index.
Perplexity
AI-powered search alternative attracting users from Google due to faster, more convenient results despite lower accuracy.
Microsoft
Its browser (Edge) is a rebadged version of Chrome, exemplifying how Microsoft lost the browser competition and became dependent on Google.
Electronic Frontier Foundation
Civil liberties organization where Doctorow served as European director; advocates for privacy-first regulation and interoperability.
Boing Boing
Early advertising-supported blog co-owned by Doctorow; demonstrates that the advertising model can work without surveillance.
People
Cory Doctorow
Coined the term 'enshittification'; discusses the three-stage platform degradation model and regulatory solutions to tech monopolies.
Kyle Chayka
Interviewer; wrote about enshittification; hosts conversation exploring platform degradation and antitrust solutions.
David Remnick
Episode host; introduces enshittification concept and frames discussion of internet degradation.
Prabhakar Raghavan
Former McKinsey consultant and Yahoo Search executive; proposed deliberately worsening Google Search to increase ad impressions.
Ben Gomes
Early Google engineer; opposed Raghavan's plan to degrade search, arguing it violated Google's original mission.
Mark Zuckerberg
Recruited MySpace users with privacy promises, then degraded the platform after achieving lock-in; example of enshittification.
Elon Musk
Example of platform executive extracting maximum value from locked-in users in final enshittification stage.
Tim Cook
Opposed EU Digital Markets Act interoperability requirements; company threatened to leave EU over regulation.
Rupert Murdoch
Owned MySpace; characterized by Doctorow as an evil, crapulent billionaire spying on users, motivating the migration to Facebook.
Justin Trudeau
Whipped his party to pass a muscular antitrust law despite not being an enemy of corporate power, showing regulatory momentum.
Joe Biden
Cornered into antitrust action by party coalition opposing concentrated power; Google case started under Trump.
Donald Trump
Initiated Google antitrust case in 2019; now attempting to stop antitrust enforcement despite earlier action.
Jason Koebler
Wrote a 404 Media article about the Kagi search engine; noted it reorganizes Google's index better than Google itself.
Quotes
"When you describe something that is all around but that is sort of so diffuse that you can't really put your finger on it, when you describe it you kind of attach a handle to it. And you give people a way to carry it around."
Cory Doctorow, early in the interview
"Google had spent many years bribing Apple to the tune of $20 billion a year not to make a competing search engine... and so they could make it worse and it wouldn't matter. Because people had nowhere else to go."
Cory Doctorow, discussing Google's market dominance
"It's like Google used to be when Google started... It's 10 bucks a month. I tried it for like 10 seconds and I'm like, okay, I'm buying it."
Cory Doctorow, on the Kagi search engine
"The core good product is in there somewhere. We're just not allowed to access it because of all these layers of tweaking and mediation."
Cory Doctorow, on Google Search degradation
"If we could just make it illegal to spy on people, we could solve so many problems. Like that would break the whole model that's going on right now."
Cory Doctorow, on privacy regulation as a solution
Full Transcript
This is the New Yorker Radio Hour, a co-production of WNYC Studios and The New Yorker. This is the New Yorker Radio Hour, I'm David Remnick. Sometimes a term is so apt, its meaning so clear and so relevant to our circumstances, that it becomes more than just a useful buzzword and grows to define an entire moment. I'm quoting the New Yorker's technology columnist Kyle Chayka. The term he's describing, well, it's so evocative that unfortunately we can't say it on the air. I immediately grabbed onto it. I knew what it meant, I could apply it in my own experience, because everything just seems to be getting worse all around us, on our phones and on websites. The term is enshittification, and it was coined by Cory Doctorow. I've been following Cory Doctorow's work for years and years on the internet. He's someone I always look to to understand what's going on online, how the latest tech policy is changing, and how things work. And when I saw him starting to use this word, enshittification, for everything getting worse, I just immediately understood what he was talking about, I think as we all do. Enshittification was a word of the year in 2023, and now it's the title of Cory Doctorow's new book. Doctorow is a prolific and respected tech blogger, and he writes science fiction too. He also played a big role in the Electronic Frontier Foundation, which advocates for civil liberties online. Here's Cory Doctorow speaking with the New Yorker's Kyle Chayka.
Well, I think that when you describe something that is all around but that is sort of so diffuse that you can't really put your finger on it, when you describe it you kind of attach a handle to it. And you give people a way to carry it around, and maybe try and carry it to each other and say, are you noticing this? I'm noticing this. I thought I was crazy. I thought it was just me. Maybe we can start with an example that a lot of our listeners will have experienced already. Can you talk about what happened with Google Search? You know, the Google DOJ antitrust trial last year surfaced all these memos about a fight about making Google Search worse. So in 2019, Google had reached maximum search growth. They had a 90% market share in search, so they weren't going to get any more users, except maybe by, like, breeding a billion humans to maturity and then making them Google users. They haven't tried that yet. Well no, that's Google Classroom. It just takes a while, right? And so, you know, as Ed Zitron reported in his newsletter, Where's Your Ed At, you see in the memos this strategy emerge. This guy Prabhakar Raghavan, who's an ex-McKinsey guy who'd been at Yahoo overseeing Yahoo Search, who's in charge of Google Search revenue, he says, why don't we make Search worse? Why don't we, like, get rid of things like spell check, or something called query stemming, where if you search for trousers it also searches for pants, or context clues, right? Like, everybody is searching for pants because someone got pantsed on national TV, and so when someone searches for pants, you look at kind of trending queries and you put that at the top. You know, all of that stuff that made it so that you could one-shot your search, right? You search, and the top result would be the one you were looking for. What if we make it a three-shot, right? What if we make it so that you've got to search two or three times, and then every time we get to show you ads?
And there's this guy at Google, Ben Gomes, who's a Googler from the OG days. He was building their first servers when it was like one computer under a desk. He oversaw the build out to all of the data centers all over the world and he's in charge of search technology and he's like, what are you talking about? This is a terrible idea. And you know, historically I think that guy would have won the argument not because the Google founders had a sense of holy mission. Maybe they did and maybe they didn't, but because Google understood that as they used to say, competition is just a click away. But at that point, Google had spent many years bribing Apple to the tune of $20 billion a year not to make a competing search engine. They had bribed all the browsers and all of the mobile carriers and all the people who made operating systems to make Google the default search. Even Microsoft's browser was just a rebadged version of Chrome. And so everywhere you looked, there was Google search and they could make it worse and it wouldn't matter. Because people had nowhere else to go and they could turn the screws tighter and tighter and extract more of our attention until we eventually flee elsewhere, which may be happening now with open AI and generative search like that in some ways is delivering a more convenient product even though it is selfless. Not necessarily a better one. Yeah. Not right, but faster. One of the things about open AI that that story tells you, because that search is not good, is that people are willing to use perplexity in open AI instead of search because search is so degraded, right? It's only good in comparison. And here's the kicker. I use a search engine. I'm not affiliated with them in any way, but I use a search engine called Cagi. And I was just amazed by how good they were the first time I used them. I was staying with my fiction editor and I was in his living room on my computer and so was he. And he was like, have you ever tried Cagi? 
And I'm like, Cagi, no, tell me more. And he's like, it's like Google used to be when Google started. And I was like, that's amazing. It's 10 bucks a month. I tried it for like 10 seconds and I'm like, okay, I'm buying it. And then I read this article by Jason Kevlar in 404 Media about Cagi. He's using it too. And he's like, by the way, they're using Google's index and they're syndicating and reorganizing its results. Now Cagi, they seem like very smart people, but I don't know how many people work there. 10, 20, maybe like, I don't know. It's certainly not 100. But you can't tell me that Google cannot do a better job with its own search index than what eight guys in a garage? It just goes to show that the product doesn't have to be so bad. Oh, it's a choice. Like the core good product is in there somewhere. We're just not allowed to access it because of all these layers of tweaking and mediation. You define this very specific process or like the stages of things getting worse, as we can say euphemistically. And I was hoping you could go through those for us. Sure. In shitification, describes a three-stage process by which platforms go bad. In stage one, the platform is good to its end users, but it finds a way to lock those end users in. There's different ways based on different platforms. Uber chased all the other cab companies out of the market. So that's one way to get locked in. It's very complicated. Facebook had a much easier one, which is like once you're in a place with a bunch of friends, it's really hard to organize them all to go. Economists just call that the collective action problem. You love your friends, but they're a pain in the ass. 
You can't agree on what board game to play this weekend, much less like when it's time to leave Facebook, especially if some of you are there because that's where you hang out with the people at the same rare disease as you or your customers or the people you left behind when you moved or the people organizing the little league carpool. So it's really hard to go. And so you have people locked in. Once they're locked in, the platform is worse to those end users in order to be good to business customers. It brings in advertisers, publishers, taxi drivers, platform sellers, performers, sex workers, whoever it is that they are brokering the introduction between. And this is where I think a lot of other critiques stop. They'll say, oh, if you're not paying for the product, you're the product. That there's a kind of conspiracy between, say, advertisers and Facebook to screw end users. But what actually happens after the business customers are locked in is they get screwed too. And so the platforms start to squeeze their business customers and they try to reach a kind of equilibrium where all the value except for whatever kind of homeopathic residue is needed to keep people locked in and to keep business customers locked into those people, all that value has been extracted, given to shareholders, given to executives. And that is like the final stage of identification. And that's where I think we find ourselves now with a lot of platforms. They're not the minimum viable product. They're like the maximally identified product. I think you described that third stage in the book as something like a giant pile of garbage. Like everyone is getting abused. I'm sure that wasn't garbage, but yes, indeed. Yes. Yes. So no one is winning except for the platform. Like in this model, the users have been attracted in. We like the features. We like the product. That's cool. Then the businesses come in and they're making money too. And everyone is happy for a little while. 
Maybe a year, in this utopian situation of the platform. And then the screws start to turn, and everyone is suffering except for the platform and its executives. So in those stages, the people who are benefiting at the end are Mark Zuckerberg, or Elon Musk of X now. What does that end stage look like that we're living in right now? Well, it's not new. They didn't invent greed in, like, 2019, nor did they invent the ways that tech platforms can change the rules from moment to moment that allow this enshittification. People have been remotely downgrading platforms and technologies for as long as they've been around. What changed is that the platforms don't lose to competitors when that happens. So Mark Zuckerberg, when he launched Facebook, the thing he offered to the general public in 2006, where you didn't have to have a .edu address to join it, his pitch was: sure, you were all using MySpace, but did you know that it's owned by an evil, crapulent Australian billionaire named Rupert Murdoch, who's spying on you with every hour that God sends? Come to Facebook, we'll never spy on you, and we'll only show you the things you asked to see. We're not going to boost stuff into your field of vision that you didn't ask for. And it wasn't enough to bring people over from MySpace. MySpace users had a collective action problem too. The difference was that back then, IP laws hadn't been monotonically expanded in the way that they have in the last 20 years. So Mark Zuckerberg was able to give MySpace users a scraper. You gave that scraper, this bot, your username and password, and it would pretend to be you at MySpace several times a day, grab everything in your feed that was waiting to be shown to you, and put that in your Facebook feed. You could reply to it there, and then it would push it back out to MySpace. So you didn't have a collective action problem. You could just move from one to the other.
Now, 20 years later, if you try to do that to Facebook, they'll say you violated Section 1201 of the Digital Millennium Copyright Act, and patents and trademarks and copyrights and trade secrets. We've kind of rigged the game so that history ended with the current round of winners, so that no one can do unto them as they did unto their predecessors. Cory Doctorow, speaking with New Yorker columnist Kyle Chayka. More in a moment. In the book, you are not a total pessimist, and, like, you're far from pessimistic about this, actually. It's not just that everything is getting more and more enshittified and worse and worse and there's nothing to do about it. There are strategies, and there are ways to, like, make that lever of enshittification harder to use. Maybe you can talk about how there are these political ideas to fix this, but perhaps under Trump right now we're not seeing so much action. Well, yeah. You know, I mean, there is something quite miraculous about antitrust in the last six or seven years, even under Trump. The case that Google just lost started under Trump. What's miraculous about it is that it's happened all over the world. It's easy to think of this as being, like, just a thing Joe Biden did. I don't think Joe Biden actually did it. I think that Joe Biden was cornered into it by elements of his party who were sick to the back teeth of concentrated power and wealth, and in order to keep that coalition together, he had to do something. But it wasn't just Joe Biden who was cornered into that, and it wasn't just Americans. I'm Canadian. In Canada, we have this very, very weak competition regime.
Our competition bureau had challenged three mergers in its whole history, and it succeeded zero times. Yet Justin Trudeau, again, hardly an enemy of concentrated corporate power, whipped his party to pass a big, muscular antitrust law that created new powers for the competition bureau. We've seen very big antitrust action in the EU and in EU member states like Germany and France and Spain, but also in Australia, South Korea, Japan, and even China. This is happening everywhere. Trump, he's trying to stop it. The reason Trump did it in 2019, when he brought the case against Google, was not merely that he was petulant about Big Tech. It was that there was this giant political tailwind for doing something about concentrated power, about monopoly. That came from you and me. It came from people who are living out what the finance sector calls Stein's Law: that anything that can't go on forever eventually stops. That power is still abroad in the world. We're seeing so much more activity and action in the EU, in the UK, as far as passing these packages of regulations. They are impacting how American users experience technology too. There's this trickle-down effect, or trickle-across. Can you talk about that a little bit? Some of that is already happening, as you say. The European Union, for example, said: Apple, we don't care how much money you make selling Lightning chargers. From now on, everything sold in the European Union, unless there's a damn good reason, is going to have a USB-C charger. We're not going to keep filling our e-waste dumps with immortal garbage that has proprietary dongles on it. Apple kvetched and complained, but they did it. I guess they decided that it was logistically too hard to split the manufacturing run, sending the USB-C ones to Europe and the Lightning ones to America and everyone else. So everybody got USB-C.
The Digital Markets Act, which came into effect in 2024, goes a lot further, and it imposes these interoperability requirements on companies. They can't lock rivals out of the platform. They can't say, oh, well, we think you might invade our users' privacy or consumer rights, so we're not going to let you in. In Europe, what they say is: well, we have a privacy law here, unlike in America. America hasn't had a new privacy law since Ronald Reagan banned video store clerks from disclosing your VHS rentals in 1988. Other countries have mature, muscular privacy laws, and they say, we'll decide that. If a company plugs into an Apple device and invades someone's privacy, we'll take care of that. You can stand down, Tim Cook. Apple is so upset about this that they threatened to leave the EU over it, which is not going to happen. In fact, after that was reported, they were like, oh, no, no, no, that's not what we were threatening at all. We're just sad. This is more in sorrow than in anger. We hope you'll see the error of your ways. Apple is not going to walk away from 500 million affluent consumers. That's how this stuff is going to go: it's easier to make one product for everyone, a singular experience, and then that product has to toe some line, if the laws are strict enough, in order to reach enough consumers. I was thinking also about AI, and I'm curious what stage of enshittification you think generative AI is in right now. Whatever AI can or can't do, the reason it has attracted hundreds of billions of dollars in investment capital is because the market is betting that you can fire workers and replace them with chatbots. I don't think you can. Not only do I not think you can now, I don't think you will be able to in the future. I think that there are lots of ways in which, when workers are in charge of how they use AI, they might make their job better. They might be better at their job. They might be happier.
But I don't think that bosses firing workers and replacing with AI is going to work and I don't think that shoveling more words into the word guessing program will make that happen. I think that's like saying, we keep breeding these horses to run faster and faster. It's only a matter of time till one of the mayors gives birth to a locomotive. A person is not like a word guessing program with more words. I think because identification is about a service that works getting worse and AI is kind of a service that was just a bunch of flashy demos, that it's in really a different space. Although it is interesting to note that all of the promising avenues for improvement according to AI bosses involve doing a lot more AI queries and having that happen automatically through these things called routers that take what would have been a query that cost you one and turn it into 20 queries that each cost you a sum but that you have no insight into. You don't get to choose how that query is broken up and subbed out to these different kinds of models at different prices, which even if they're not ripping you off now, which I'm 100% sure they are, they will rip you off in the future with. They just have a black box where it's just like you give them a credit card and then after you ask a question, they tell you how much they've charged your credit card. Why wouldn't they abuse that power? Yeah, they're already extracting more value from the user as much as they can. Right now, we're seeing open AI rollout advertising friendly products. Advertising in your book seems to be the harbinger of the worst kinds of identification and is the model that underlies much of the internet right now. Actually, I want to quibble with that a little. I was the co-editor for many years and I'm still the co-owner of a website called Boing Boing, which is one of the first big advertising supported blogs. 
Our advertisers only made us invade our users' privacy to the extent that our users' privacy was invadable. When pop-up blockers became normal, advertisers stopped asking for pop-ups. When ad blockers became normal, advertisers became less interested in invading people's privacy. I think that if we banned surveillance advertising and just went back to contextual advertising, advertisers would say, oh, well, we're not going to advertise anymore, but when the time comes, of course they're going to advertise. There's no way companies are not going to pay advertising firms to tell other people about their products. It's a completely ridiculous thing to claim. Back to the thesis of the book: the policy environment creates enshittification. The enshittificatory environment creates the regime in which bad impulses, bad people, bad ideas thrive. We have to make a hostile environment for enshittification. We are long past the day when we should have updated our privacy law. At the Electronic Frontier Foundation, we have this campaign called Privacy First, where it's like: if you're angry because grampy is a QAnon and you think Facebook brainwashed him, or you think the reason your teenager is anorexic is because Insta brainwashed her, or you think the reason the millennials in your life are quoting Osama bin Laden is because TikTok brainwashed them, or if you're angry about cops using reverse warrants to round up protesters at anti-ICE demonstrations or the January 6 riots, or if you're angry about kids being followed into abortion clinics by their phones, or if you're angry about someone making deepfake porn of you, or if you're angry about people being racially discriminated against when they get a loan or get a job or get a mortgage, what you're really angry about is privacy, right? This is all surveillance. And the coalition for this is so big, and it crosses so many political lines, that if we could just make it illegal to spy on people, we could solve so many problems.
Like that would break the whole model that's going on right now. Yeah. Oh, Cory, thank you so much for coming on. It was a pleasure to talk to you. The pleasure is very mutual. Thank you for having me on. Cory Doctorow is a writer and former European director of the Electronic Frontier Foundation. He spoke with staff writer Kyle Chayka. Kyle's column, Infinite Scroll, publishes weekly at NewYorker.com. And you can subscribe to The New Yorker there as well, NewYorker.com. I'm David Remnick. Hope you enjoyed the show. Next week, Jon Stewart joins us in a conversation from the New Yorker Festival. Don't miss it. The New Yorker Radio Hour is a co-production of WNYC Studios and The New Yorker. Our theme music was composed and performed by Merrill Garbus of Tune-Yards, with additional music by Louis Mitchell. This episode was produced by Max Balton, Adam Howard, David Krasnow, Jeffrey Masters, Louis Mitchell, Jared Paul, and Ursula Summer. With guidance from Emily Botein and assistance from Michael May, David Gable, Alex Barisch, Victor Guant, and Alejandra Deccan. The New Yorker Radio Hour is supported in part by the Charina Endowment Fund.