This is an iHeart podcast. Guaranteed human.

Hello and welcome to this week's Better Offline monologue. I'm your host, Ed Zitron. Now, this week's monologue is about something I also just wrote up in my newsletter: a new term I came up with called Analyslop. Analyslop is when somebody writes a long, specious, but authoritative-sounding piece of writing with few facts or actual statements, with the intention of it being read as thorough analysis. Researched analysis, indeed. Now, this week, alleged research group Citrini Research, not to be confused with Andrew Left's Citron Research, or, well, me, put out a truly awful piece of writing called "The 2028 Global Intelligence Crisis," a slop-filled piece of scare fiction written and framed with the authority of deeply founded analysis, so much so that it caused a global sell-off in stocks. And I must be clear: if you took this seriously, if you listened to it, I don't know, had it read out by NotebookLM, or you read it and kind of furrowed your brow and thought, wow, this is really provocative, you're a mark. You are being conned, or you're in the process of being conned if you haven't been already. Now, if you haven't read this piece, please do go and read my annotated version; it'll be linked in the notes.
Those of you on YouTube won't get the link; just go to my Bluesky, I swear to God it's there. The piece spends 7,000 words telling the dire tale of what would happen if AI made an indeterminately large number of white-collar workers redundant. Now, you read this boring fucking 7,000-word-long piece, and it isn't clear what the AI exactly does, who makes the AI, how the AI works, really anything about it, just that it replaces people and then bad stuff happens.

Citrini insists that this isn't bear porn or AI doomer fan fiction, but that's exactly what it is: mediocre slop framed in the trappings of analysis, sold on a Substack with "research" in the title, specifically written to spook and ingratiate anyone involved in the financial markets who wants to be scared of this stuff. Its goal is to convince you that AI is scary, that your current stocks are bad, and that AI stocks, unclear which ones those are, by the way, are the future. Also, they charge way more for a year than I do; mine's 70 bucks, and my shit's way better than this fuckware.

Now, look, okay. Let me give you an example. I'm going to do a voice: "It should have been clear all along that a single GPU cluster in North Dakota generating the output previously attributed to 10,000 white-collar workers in midtown Manhattan is more economic pandemic than economic panacea." Now, he writes "economic" correctly; I just fucked up saying it. The goal of a paragraph like this is for you to say, wow, that's what GPUs are doing now, they're doing that today. It isn't, of course. The majority of CEOs report little or no return on investment from AI; a study of 6,000 CEOs across the US, UK, Germany, and Australia found that more than 80% detected no discernible impact from AI on either employment or productivity. Nevertheless, you read "GPU cluster in North Dakota" and you think, wow, that's a place I know, and I know that GPUs power AI. Now, I'm a curious little critter, so I happen to know of a GPU cluster in North Dakota.
CoreWeave's one with Applied Digital, which has debt so severe that it loses both companies money even if they have the capacity rented out 24/7. But let's not let facts get in the way of a poorly written story.

Now, I don't need to go line by line on this, and I'm not going to, because I'll end up saying something legally actionable, but I need you to know that most of this piece's arguments come down to magical thinking and utterly empty prose. For example, how do you think AI takes over the entire economy? And I quote: "AI capabilities improved, companies needed fewer workers, white-collar layoffs increased, displaced workers spent less, margin pressure pushed firms to invest more in AI, AI capabilities improved." That's right, they just get better. I don't need to discuss anything happening today. Even AI 2027 had the balls to start making stuff up about OpenBrain or some such bullshit. This piece literally just says stuff, including this particularly egregious lie I'm about to read to you: "In late 2025, agentic coding tools took a step-function jump in capability. A competent developer working with Claude Code or Codex could now replicate the core functionality of a mid-market SaaS product in weeks. Not perfectly, or with every single edge case handled, but well enough that the CIO reviewing a $500,000 annual renewal started asking the question: what if we just built this ourselves?" Now, quick thing: what annual contract? There are tons of SaaS contracts, though, far larger than that. But this is also a complete and utter lie. A bald-faced lie. This is not something that Claude Code can do. The fact that we have major media outlets quoting this piece suggests that those responsible for explaining how things work don't actually bother to do any of the work to find out, and it's both a disgrace and an embarrassment for the tech and business media that these lies continue to be peddled.
As I went over in last week's monologue, the entire AI-replacing-software story is a con. It's clear that the markets and an alarming number of people in the media simply do not know what they are talking about, or are intentionally avoiding thinking about it. The "AI replaces software" story is literally "Anthropic has released a product, and now the related industry is selling off," such as when it launched a cybersecurity tool that could check for vulnerabilities, a product that's existed in some form for nearly a decade, and this caused a sell-off in cybersecurity stocks like CrowdStrike. You know, the one that pushed a faulty bit of code that caused a global cybersecurity incident, lost the Fortune 500 billions, and resulted in Delta Air Lines having to cancel over 1,200 flights over a period of several days. There is no rational basis for anything about this sell-off, other than that our financial media and markets do not appear to understand the very basic things about the stuff they invest in or talk about.

Software may seem complex, but especially in these cases, it's really quite simple. Investors are conflating "an AI model can spit out code" with "an AI model can create the entire experience of what we know as software," or is close enough to it that we have to start freaking out, which it isn't. It obviously isn't. Anyone who builds software knows this. Anyone peddling this is just wrong. And it's this Replit vibe-coding shit. I swear to God, people in the media use it and they just go, wow, this is the future now, all software will be built like this, as they fail to build anything. Calm down. Calm down about the software, you know? I think we really need to think deeply about how, for the second time in a month, the markets and the media have had a miniature shit fit based on blogs that tell lies using fan fiction.
As I covered in my annotations of Matt Schumer's bullshit "something big is happening" piece, the people who are meant to tell the general public what's happening in the world appear to be falling for ghost stories that confirm their biases or investment strategies, even if said stories are full of half-truths and outright lies. I'm despairing a little. When I see Matt Schumer on CNN, or hear from the head of a PE firm about Citrini Research, I begin to wonder whether everybody got where they are not through any actual work or knowledge, but by making the right noises. We're in a grifter economy, and the people who should be stopping the grifters are asleep at the fucking wheel.