Better Offline

Monologue: AI Isn't Replacing Software Companies, Calm Down About Claude Code

8 min
Feb 20, 2026
Summary

Host Ed Zitron argues that the recent stock market sell-off driven by fears that AI will replace software companies is based on flawed assumptions. He contends that building internal software clones using Claude Code ignores the massive hidden costs of maintenance, compliance, security, and infrastructure that established SaaS providers handle, and that the entire AI boom is unsustainably subsidized.

Insights
  • SaaS pricing includes hidden value: maintenance, compliance, security, and infrastructure management that companies underestimate when considering building internal alternatives
  • AI-generated code lacks intentionality and documentation, creating long-term technical debt and maintainability problems as codebases scale
  • Current AI capabilities are heavily subsidized by vendors, making real-world economics unsustainable at scale and masking true operational costs
  • Media narratives around AI capabilities are often disconnected from practical business realities and driven by marketing incentives rather than technical accuracy
  • The sustainability of AI services depends on pricing models that are currently not economically viable for widespread adoption
Trends
  • Unsustainable economics of AI services: current pricing and subsidies cannot support long-term viability at scale
  • Growing disconnect between media narratives and technical reality in AI coverage
  • Increasing recognition of hidden costs in software maintenance and compliance (SOC 2, GDPR, security)
  • AI-generated code quality concerns: lack of intentionality, documentation, and long-term maintainability
  • Market correction expected when AI subsidies end and real costs are reflected in pricing
  • Homogenization of software products due to shared training data in AI code generation
  • Risk of significant media credibility loss when AI bubble deflates
  • Shift from evaluating AI by capability demos to evaluating by sustainable business economics
Topics
  • AI Code Generation and Software Development
  • SaaS vs. Build-Your-Own Software Economics
  • Software Maintenance and Technical Debt
  • Compliance and Security Requirements (SOC 2, GDPR)
  • AI Subsidization Models
  • Media Coverage of AI Technology
  • Claude Code Capabilities and Limitations
  • Software Infrastructure and DevOps
  • Long-term Software Maintainability
  • AI Bubble Economics
  • Enterprise Software Replacement Narratives
  • Code Quality and Intentionality
  • Business Television and Financial Media Accuracy
  • AI Pricing Sustainability
  • Software Product Homogenization
Companies
Salesforce
Used as the primary example of enterprise software that, per the narrative the host rejects, could easily be replicated with Claude Code
Microsoft
Mentioned as major software company whose products (Microsoft 365) companies might attempt to build internally
Anthropic
Host is writing critical analysis piece about the company; discussed regarding unsustainable API token economics
Notion
Listed as example SaaS product that companies might attempt to clone using AI code generation
Trello
Mentioned as example of simple tool that Claude Code might generate, but not complex enterprise software
Monday.com
Listed as example SaaS product companies might attempt to build internally instead of purchasing
Asana
Mentioned as project management software companies might attempt to replicate with AI code generation
Spotify
Referenced in iHeart ad regarding podcast listening statistics compared to streaming music
Pandora
Referenced in iHeart ad regarding podcast listening statistics compared to streaming music
iHeart
Podcast network hosting the show; mentioned as largest podcaster in promotional content
People
Ed Zitron
Host of Better Offline; delivers monologue criticizing AI hype and media narratives around software replacement
Sam Altman
OpenAI CEO referenced as pushing marketing narratives that media uncritically amplifies
Quotes
"When you pay a software company, even a dogshit mediocre one, a monthly fee, you're not just paying them to access the software, but to take away the burden of maintenance that comes with running a software company."
Ed Zitron
"You're being played. You're being conned. And by extension, you're conning your listeners, your readers, and your viewers."
Ed Zitron
"Everything you were using is being subsidized. It's being subsidized by these companies in the hopes that it kind of bakes into your life, except it's not good enough to do that."
Ed Zitron
"Just more software existing doesn't mean the software's functional or useful."
Ed Zitron
"At some point, you're just looking at a kind of a rat's nest of crap, of code written unintentionally, spooged out, meaningless."
Ed Zitron
Full Transcript
This is an iHeart Podcast. Guaranteed human. Run a business and not thinking about podcasting? Think again. More Americans listen to podcasts than ad-supported streaming music from Spotify and Pandora. And as the number one podcaster, iHeart's twice as large as the next two combined. Learn how podcasting can help your business. Call 844-844-iHeart. Cool Zone Media. Hello and welcome to this week's Better Offline monologue. I'm your host, Ed Zitron. So in the last few weeks, there's been a dramatic sell-off in software stocks driven by the anxiety that companies will, instead of paying for someone like Salesforce or Microsoft, simply build their own software. It is a genuinely stupid assumption based on analysts and reporters who simply do not care about the truth. In their mind, one can simply type "build me Salesforce now" into Claude Code and have it barf out an identical, functional clone that's compliant, secure, stable, all because somebody was able to bonk it on the head enough times to spit out something that sort of looked like a tool like Trello, maybe, or a personal website. Look, when you pay a software company, even a dogshit mediocre one, a monthly fee, you're not just paying them to access the software, but to take away the burden of maintenance that comes with running a software company. Minor things like currency changes or time zone shifting can cause major problems in systems that aren't built with intention. You know, like something an LLM would spit out. And things get even more complicated when you start connecting other systems of record, like billing or a customer's personal information, especially if they're in Europe, by the way. Plan to have any of that information actually connect with a customer's systems? Well, you're going to need a SOC 2 audit, and you're definitely going to need to make sure it's got rock-solid security so that nobody can swipe all of that data and then you get sued.
I also assume you're going to effectively take an engineer off of one of your teams, probably for good, to be honest, to maintain your new internal Salesforce, Monday, Microsoft 365, and Notion clones, your Trello clone as well, probably your Asana. I mean, how much software are you going to build? Good thing you've got Claude Code to help speed that up, right? Just make sure you read everything it writes, because every little fuck-up just became your problem, and you've got nobody to scream at, because while your company is saving on per-seat, per-month fees, you also fired the people whose job it is to make sure your nasty little software subscriptions actually fucking function. And while you may fear that a boss might try and force this down your throat to save money, I must be clear how impossible this task is. Even the most annoying, frustrating Suffer-as-a-Service contract protects you from the grueling underlying maintenance and infrastructural bullshit that goes into making sure the thing you're paying for actually loads and functions wherever you load it, and even on the browser that you want to load it in. In most cases, thank you, Riverside. The people pushing this narrative are either fundamentally disconnected from how the world works or actively incentivized to mislead you. I've seen this narrative propagated on multiple different business television shows and supposedly respectable outlets, and it makes me genuinely worried that we don't have a media industry prepared to dissect fundamentally deceptive narratives. Just because it's possible for a non-coder to cobble together a website that looks near identical to a model's training data in the space of an hour doesn't mean that we're replacing every software company, nor is its ability to do so any indicator that we'll be able to do more than that in the future. I plead with the media, please stop filling in the gaps.
Please stop seeing every incremental improvement as proof of whatever marketing slop Wario Amodei or Clammy Sam Altman is trying to cram down your throats. You're being played. You're being conned. And by extension, you're conning your listeners, your readers, and your viewers. And once the bubble pops, I believe they will demand an explanation from you. I certainly will. And really, in this era, I think people are underselling how big a reckoning there will be when the bubble collapses. When I say falls apart, I mean that the current rates that you are paying are not even close to being sustainable. I'm currently working on a piece called The Hater's Guide to Anthropic on my premium newsletter. And I found a mathematical study that found that on a ... a month Claude subscription, you can spend over ... of Anthropic's API tokens. And on the ... a month, about ... I guess it takes money to lose money. But this is the reality of the AI bubble. Everything you were using is being subsidized. It's being subsidized by these companies in the hopes that it kind of bakes into your life, except it's not good enough to do that. The frenzied media push around Claude Code existed to kind of beguile you, to make you think that there's more happening than a tool that might be able to kind of build a website, sort of, or build you some half-functional software that maybe works sometimes. And when it comes to actually building that into something you could sell, you're shit out of luck. You actually do need to code. And even the things that you vibe code aren't really secure or functional. And even that mediocre web slop you're seeing spooged out by Claude Code is heavily subsidized.
If people had to pay the real rates, those people you see jacking off on Twitter about Claude Code, they'd be paying 200-plus dollars a day. Do you think they'd actually pay that? Do you think that's actually happening? Because I'm sick and fucking tired of hearing all the people rambling about Claude Code writing all the goddamn code. I'm sick and tired of it, because I don't see what the end point is. I don't hear an actual result from this. I don't think people should be laid off, but if this was doing the thing that they were saying it would, that would be happening, and it would be definitively connected to AI, or more software would be being shipped that's actually good. Just more software existing doesn't mean the software's functional or useful. And if you look at vibe-coded apps, they all kind of look the same. And that's a result of them all using the same training data. And it's why these things are not really good at building nuanced or unique software, because that's not what they do. They copy software that's already been written, and they do so with no intention, no real plan, and nothing you can look back on and say, oh right, that's why they built it this way. And perhaps that's not a problem if you're a solo person building your own diddly little app, but expand that to a thousand people or 10,000 people. Let's say five, ten years passes, and let's take one of these, what I think are bullshit statements, about how, oh, 90% of our code's written by AI. Great. What happens in three years or four years when you go and look back at that and you go, shit, why did we do that? Oh, the person left. Or the person didn't leave, they'd just been drinking heavily and they don't really remember what they were prompting the model with. And oh man, how do we actually see why they built this or what the reason was? This becomes a huge problem when things start breaking or when you have to make things more efficient. And then people will say, oh, we use AI to fix that.
At some point, you're just looking at a kind of a rat's nest of crap, of code written unintentionally, spooged out, meaningless. How do you fix that long-term? How do you build an efficient software package out of that? How do you build software that continues to run well into the future? Is the idea that you just build AI on AI? God, no. Anyway, I think once the real costs are charged, people are going to drop this shit like a bad habit. I can't wait. I'm going to be smug about it. I'm going to be annoying about it. I'm not going to lie. But I think, based on the YouTube comments, you're all going to join me. Now, next week we've got more hater season, and I'll be back with one of these monologues, maybe, or perhaps I'll give you multiple interviews. Done some weird ones. Got Victoria Song on wellness, got Adam Becker on Epstein. I mean, got all sorts of shit coming. Looking forward to bringing you more, and I love you all. When segregation was the law, one mysterious Black club owner, Charlie Fitzgerald, had his own rules. Segregation in the day, integration at night. It was like stepping into another world. Was he a businessman? A criminal? A hero? Charlie was an example of power. They had to crush him. Charlie's Place, from Atlas Obscura and Visit Myrtle Beach. Listen to Charlie's Place on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, everyone. It's Emily Simpson and Shane Simpson from the Legally Brunette podcast. Each week, we're bringing you true crime through a legal lens. Whether you want all the facts on the disappearance of Nancy Guthrie, or you still need to wrap your head around the Diddy verdict, we're breaking it all down step by step. And we're not just lawyers, we're also husband and wife. It makes for some pretty entertaining episodes. Listen to Legally Brunette on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. This is an iHeart Podcast. Guaranteed human.