Radiolab

Content Warning

29 min
Oct 17, 2025
Summary

This episode explores how social media platforms have fundamentally shifted their content moderation approaches, moving from reactive removal to proactive algorithmic curation. Host Simon Adler interviews legal scholar Kate Klonick about how TikTok's model has influenced Facebook, X, and others, transforming these platforms into controlled broadcast networks rather than public squares, with significant implications for free speech and political power.

Insights
  • TikTok's content moderation model (pushing curated content rather than pulling down violations) has become the industry standard, replacing Facebook's reactive approach and creating 'platform islands' with distinct ideological ecosystems
  • Modern content moderation is now a vector for political and economic power—controlling what appears in feeds shapes public opinion at scale, making it as valuable as traditional media ownership
  • The shift from filter bubbles to platform islands means users self-select into ideologically aligned spaces rather than being algorithmically confined, fragmenting the shared information landscape
  • Prior restraint (preventing content from appearing) is more insidious than post-publication removal because users never know what they're missing, making algorithmic suppression harder to detect than censorship
  • Social media platforms increasingly resemble broadcast networks camouflaged as organic spaces, raising questions about whether traditional media regulation frameworks should apply to tech platforms
Trends
  • Shift from reactive to proactive content moderation strategies across major platforms
  • Consolidation of media power among platform owners who directly control algorithmic amplification
  • Rise of 'platform islands' where users self-segregate into ideologically distinct digital spaces
  • Automation of content moderation replacing human review and trust-and-safety teams
  • Blurring of lines between social media platforms and broadcast networks in terms of editorial control
  • Political parties and governments recognizing content moderation as a tool for shaping public opinion
  • Decline of shared information ecosystems and rise of personalized, emotion-optimized content feeds
  • Regulatory gap: existing media regulation frameworks not applied to tech platforms despite similar power dynamics
  • Journalists reporting platform activity as real-world events despite platforms being controlled editorial spaces
  • Economic incentive alignment: platforms profit from engagement regardless of informational accuracy or societal impact
Topics
  • Content Moderation Policy Evolution
  • TikTok Algorithm and Censorship Model
  • First Amendment and Private Platforms
  • Platform Regulation and Media Law
  • Algorithmic Amplification and Shadow Banning
  • Prior Restraint in Digital Spaces
  • Political Weaponization of Content Moderation
  • Filter Bubbles vs Platform Islands
  • Fact-Checking Program Elimination
  • Broadcast vs Public Square Metaphors
  • Misinformation and Election Integrity
  • Hunter Biden Laptop Censorship
  • COVID-19 Lab Leak Narrative Control
  • Elon Musk and X Platform Control
  • Personalized Content and Emotional Manipulation
Companies
TikTok
Central case study for a proactive content moderation model that pre-screens and curates content rather than reactively removing violations
Facebook
Discussed for pioneering reactive content moderation, fact-checking programs, and recent shift toward TikTok's approach
Meta
Parent company of Facebook and Instagram; announced elimination of fact-checking program in January 2025
X (formerly Twitter)
Described as mirror image of TikTok, deliberately amplifying emotionally reactive content under Elon Musk's ownership
Instagram
Meta platform discussed alongside Facebook regarding content moderation policy changes
YouTube
Mentioned as a platform where journalists source news content and where content moderation decisions shape public perception
Bluesky
Emerging platform to which journalists and users are migrating; discussed as an alternative to X
New York Post
News outlet whose Hunter Biden laptop story was suppressed on Facebook and Twitter before 2020 election
People
Kate Klonick
Professor at St. John's Law School and expert on social media content moderation policy; primary interview subject
Simon Adler
Radiolab reporter and producer who reported this episode and has covered content moderation over multiple years
Mark Zuckerberg
Facebook/Meta CEO who announced elimination of fact-checking program and shift toward community notes in January 2025
Elon Musk
X owner whose algorithmic amplification and direct content decisions exemplify platform owner editorial control
Molly Webster
Radiolab co-host who appears in episode intro discussing snail-themed content and membership program
Quotes
"TikTok comes from obviously China and it comes from a censorship, kind of authoritarian CCP culture. And I mean, I believe the Chinese kind of approach to speech is very reflected in the algorithm that TikTok uses."
Kate Klonick
"It is a we get to determine what people see and say. And that that's it. So they're just taking tons and tons and tons of stuff down."
Kate Klonick
"The ultimate in censorship in American First Amendment law is really prior restraint... Whereas with TikTok, you never even know what you missed. You never even know what you were kept from seeing."
Kate Klonick
"Once you have that, once you can control the emotions of people, with the flip of a dial by putting something in front of them that are going to only peak that feeling for them, then you could just control everybody."
Kate Klonick
"If there are if we're talking about like the same three main places that people are going to for their news... If that is like three people and they're all friends of the president, like that's that's a problem."
Kate Klonick
Full Transcript
Hey, I'm Molly Webster. Hey, I'm Mona McGowker. Mona and I just made a snail episode. It's called Snail Sex Tape. And we have not stopped talking about snails for like months. We've become deeply obsessed with snails. I think we should all get snail tattoos. Ooh, snail tattoo could be cute. But you know what you can get instead of a snail tattoo? What? You can get an enamel snail pin in honor of our Snail Sex Tape episode. I've never been more honored in my life. I know. It is based on a real medieval snail miniature. I will be rocking it on my jean jacket all spring long. So to get one of these pins, you have to join the lab. And when you join the lab, in addition to helping fund our show, you get access to sponsor-free podcasts, plus monthly bonus content, plus invitations to events with the team. Including an AMA that we're going to be doing next month, you and me, about the behind the scenes of making Snail Sex Tape. Behind the shell. BTS. All you have to do is go to radiolab.org slash join. And if you use the code word snail, you get two months off the first year of an annual membership. Get your pin. And we can't wait to see you guys next month. Thanks, everyone. Wait, you're listening. OK. All right. OK. All right. You're listening to Radiolab. From WNYC. The six. See? Yeah. You're right in here. Awesome. You're going to be speaking in that microphone. Just go ahead. Nope, the one close right here. Hey, I'm Simon Adler. This is Radiolab. 1, 2, 3, 4, 5. Can you hear me, Kate? Yep. And that is Kate. Yeah. Kate Klonick. I'm a professor at St. John's Law School. I've talked to her a bunch over the years. We did a couple different stories that felt like news at the time about Facebook's rules for what we can and can't post on their platform. Don't get me saying the F word again, because last time my parents yelled at me. Did they? Yeah, they were like, Kate, you're an adult now. Oh, come on. You're a serious person. I prefer to swear on the radio as much as possible. We covered the origins of these rules and just how complicated they can become. But beyond the specifics, what we were really exploring was how the ideal of free speech plays out in different spaces in our society. You know, from a good old public square where anyone can say anything they want to lightly regulated broadcast TV to straight up private spaces. And we were asking, like, where does social media fit into all that? And, you know, I kind of thought we were done talking about all this. But then. I'm happy we still have a show to, I guess, this past month. Jimmy Kimmel can't say that anymore. The late night host taken off air indefinitely. As we all know, free speech was in the news again. I mean, look, we can do this the easy way or the hard way. That's censorship. That's state speech control. And these questions of who can say what, where, and how much pressure the government can or can't exert just felt fresh and vital all over again. And so I called Kate, yeah, to see how this is all playing out online. Yeah. And now it is a problem of, OK, how do we stop billionaires and authoritarian governments from twisting these platforms into censorship machines or political propaganda? OK. I know, that's kind of how I feel, too. Well, I guess before we get into all of that, let's build a bit of a foundation first. Sure. So I guess how has the actual practice of keeping stuff up and taking stuff down changed? And why? Sure.
So the main thing, the main thing from the last time we talked that has really, truly changed from like 2020 to 2025 is the rise of TikTok. I mean, if you will remember, like in two short years that had basically caught up with 12 years of Facebook's growth. And I mean, TikTok has a different way that they run their content moderation. OK, how so? Well, when we spoke in these past episodes, one of the assumptions of content moderation when it was getting off the ground, be it Facebook or Instagram or YouTube, was that we don't want to censor people unnecessarily. Yep. And so you would keep content up until it was reported as being harmful. And then you would make rules that would limit and try to preserve voice as much as possible, as they put it. That was like the industry term for free speech, voice. There were limits to that, obviously. But generally, like it was a keep it up unless we have to take it down type of thing. But that's not TikTok. TikTok comes from obviously China and it comes from a censorship, kind of authoritarian CCP culture. And I mean, I believe the Chinese kind of approach to speech is very reflected in the algorithm that TikTok uses. It is not a default. Everyone should see everything. This is a free world and people have a right to say whatever they want, even if it's a private platform. It is a we get to determine what people see and say. And that that's it. So they're just taking tons and tons and tons of stuff down. Oh, I mean, no. Like, TikTok, it prescreens such a volume of content that they determined to not be outside of certain political parameters. And so they're less likely to cause negative interaction effects, to put kind of an economic term on it. If I can put a stupid man's term on it, it's like they are choosing to push things up instead of pull things down. That's a perfect way of thinking about it. And they push things up that are very milquetoast, very like happy, make you feel good, very apolitical. And so this is basically down-ranking or shadow banning. The idea that you're going to manipulate the algorithm to not delete the content, but not promote it. And in addition to that, the algorithm is constantly improving and iterating on all the behavioral signals that you give it. And so it's able to provide a very addictive and expectation meeting. Product. Yeah, product. I mean, there's no way I'm like almost an experience, but I'm like, yeah, kind of, but it's not an, it's, it's, I don't know what it is. I have a confession, which is that I've maybe spent five minutes on TikTok in my life. I don't have TikTok. You don't either. Well, I have like rules for some of these things. Okay. But, you know, I study online speech for a living. So it seems kind of crazy, but I like, I don't need to actually be on TikTok for TikTok to be all over my life. I see TikTok videos constantly. They're cross-posted. I don't need to actually be on TikTok. Well, and on that, it is interesting that TikTok figured out how to make banal stuff compelling, because we were certainly told that, well, the reason Facebook wants to leave some of this stuff up is because it's the, it's the highly emotive, highly reactive stuff that keeps people around. So what, what did we have wrong there? Was this just like an adjacent path to the same outcome, which is keeping people on a platform? Oh, I mean, I think that it's actually fascinating. You know, what they figured out is it is a format, a video, that people are, are hooked by. And so it does not really matter.
You will find yourself often watching things that you didn't know you were interested in, but like you're just compelled by certain types of couples that like look very different from each other, doing any type of like interaction. Fascinating. So it's like Facebook figured out the sort of information that would keep you there. TikTok figured out how to package any information to keep you there. Yes. That's like one way of thinking about it. Like, yeah. I mean, you know, but this is not new. I mean, like advertisers have been doing this forever. Sure. Right. Like this is, you know, it's just a very different business model. It is a very different product model. And it seems to then be a very different informational ecosystem you're creating. Because if you're pushing up everything that falls within certain bounds and you're deciding what those bounds are, it becomes far more, like, is "controlled" the right word? What's the word? Yeah, it's controlled, but it's also in like a certain way is even more dangerous because like the ultimate in censorship in American First Amendment law is really prior restraint. Wait, sorry. Oh, sorry. Excuse me. What is prior restraint? Prior restraint is censorship before something goes up or is ever published. Oh, so it's not redacted. It's that it was never printed. Exactly. That is the exact distinction. And it's important because the existence of this redaction, the proof that it was removed from Facebook, is actually evidence that censorship has happened. Right. Right. Right. Right. Whereas with TikTok, you never even know what you missed. You never even know what you were kept from seeing. And that is really unfortunately what we're staring down at this moment because in the last five years, American social media has moved towards TikTok's approach to content moderation. Well, OK, I didn't expect us to be talking about TikTok so much, but I'm glad we have. So if I'm telling the story of this, it's like once upon a time, Facebook creates content moderation for everything, all these policies, all these rules. Meanwhile, TikTok is sort of lurking across the Pacific, eventually jumps over, and Zuckerberg and the Silicon Valley folks see they're doing it this very different way. When does that actually start to shift? Not just the way Facebook is thinking about its content moderation, but also maybe the way people are experiencing Facebook as a result. That is not as clear, but the biggest sea change is the one that you're thinking of. Hey, everyone, I want to talk about something important today because it's time to get back to our roots around free expression on Facebook and Instagram, which is the one that happened on January 7th of this year, 2025. Mark Zuckerberg announced the end of the fact checking program. We've reached a point where it's just too many mistakes and too much censorship and that he was going to try to move towards a community notes based system of content moderation. So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms. And I mean, I think that like it was and it wasn't a sea change. OK, well, and talk to me like when we say Facebook got rid of its fact checking, at its sort of height, what was Facebook's fact checking? OK, so not much, which is why this was so really. OK. Which is why this was such a such a frustrating announcement. And it was frustrating that the media focused on it so much.
The fact checking was like a commitment to fact checking because there had been so much clamor about mis or disinformation, but they were removing posts days after they were flagged and like it was very small. And so to watch it go on the chopping block was really more of a signal to a very particular person and to a very particular party that felt like big tech censorship was coming for them. And like, you know, we can get into a whole kind of conversation about whether or not that was reality based, but that was kind of the complaint. Right. And if I'm going to mount the best defense for conservatives about censorship by big tech, it would be that during the pandemic, there was sort of a party line as to what was an acceptable way to talk about the origins of the pandemic, right? Yeah. And you can even go before the pandemic. OK, you could take it before you can. There's a few things. And one of them was there are serious questions for Joe Biden this evening, following the publication of emails allegedly belonging to his son, Hunter, the Hunter Biden laptop scandal, reporting lays out purported emails between Hunter Biden and Ukrainian businessman. New York Post. They broke the story and links to that were taken off Facebook and Twitter. That was absolutely censored. And what was the justification by Facebook? Well, that was happening a couple of weeks before the 2020 election. And so what had been the huge concern for Facebook and all these other companies was how social media impacted the 2016 election. And so they made a lot of big changes. And one of them was just kind of like, we're not going to allow things that could possibly be foreign influence to stay up because this is exactly when we got yelled at in 2016. And so they kind of overcorrected. And I think in hindsight, it was a really hard call. And maybe probably the wrong one. And then you extend that to the Wuhan lab leak. Now, those were just insane, insane issues. And look at us. We're still talking about them today. It's not like they were that censored, unlike going to say China, where it's like, you're like, oh, you know, Tank Man. And they're like, who? Yeah. Right. Because there are no photos of Tank Man. Right. They are not published. Right. And so it's not like I just also point taken. OK, well, so then like what has changed then? If, yes, there was some censoring going on and censoring of things in these sort of critical moments, like, would that not happen now? Is that the difference? I mean, I, I, my honest belief, I can't predict the future, but my honest belief is this administration would very quickly put the platforms in line. Yeah, I think that there would be no hesitation to do this, because I don't think that this was ever about free speech. It was about their speech. And that is that is really what you're unfortunately seeing right now. There are no recognizable free speech notions coming out of this current administration, and with the TikTok-ification of social media, people have seen the vector for power that is in content moderation. Hey, Lulu here. And this episode is sponsored by BetterHelp. It is March in like a lion, out like a lamb, and somewhere in the middle. It's International Women's Day. And BetterHelp wants us all to just take a moment to consider the women in our lives, our personal lives, our society and thank them for their strength and for all that they carry. That work matters. They matter. You matter.
And therapy offers a space for all of us to take care of ourselves in the way we deserve. Think about the roles you play for the people you love. Think about how those roles, intentionally or not, weigh on you and in the worst moments work to weigh you down. Therapy helps create perspective, set healthy boundaries and work toward balance. BetterHelp has loads of therapists, all of whom work according to a strict code of conduct and are fully licensed in the U.S. Why not give it a try? Fill out a short questionnaire and BetterHelp will use their 12 plus years of experience to match you with one. If you aren't happy with your match, switch to a different therapist at any time. Your emotional well-being matters. Find support and feel lighter in therapy. Sign up and get 10% off at betterhelp.com slash radiolab. That's better H-E-L-P dot com slash radiolab. From WQXR and Carnegie Hall comes classical music Happy Out. A new podcast hosted by me, Maniacs. Each episode will speak with a special guest about their lives. Listen to musical gems, answer your classical queries and take part in playful musical games. So grab a drink and press play on a new podcast celebrating our love for all things classical. Listen wherever you get your podcasts. Okay, so Kate, you were saying that TikTok has this fundamentally different approach to content moderation, that instead of reactively taking stuff down, they are proactively flooding the zone with happy-making stuff. That Facebook and X and others have taken notice and started adopting this approach. And that all this has happened as folks have begun to see that content moderation itself is, I think you said, a vector for power. Yeah, I think that basically what you're seeing is the power over what appears in your feed or doesn't appear in your feed or the types of new content that you're recommended or the first commenters that you see on a video that you just watched. That type of control is an ability that we've never seen before. I remember when I was first writing about this in like 2017, 2018, presenting my research, one of the things that people were so concerned with is filter bubbles. Well, we're going to be in these filter bubbles fed to us by the algorithm. And as it turns out, that was, one, very true that that would happen. But also even maybe more disturbingly, we don't even need filter bubbles anymore. People are just choosing platforms based on the types of content that they expect to find there. And in that way, so we've gone from filter bubbles to platform islands, where the owners of the platform get to push up whatever it is that fits whatever their ideological ends are. China and TikTok, it seems to be like milquetoast stuff that's not going to rile you up, but it's going to keep your eyeballs on here. It feels a little bit like X, formerly Twitter, is the mirror image where it's like, we're just going to rile you up all the time. Is that right? And is that what we're going to just see more of, which is come to this platform island for emotion A, come to that platform island for emotion B? I think that that's exactly right. I mean, yeah, I mean, that's what we go to the movies for. That's what we tune into like certain types of things for, right? It's, I don't know, I'm not in the mood for, you know, a horror film. So I don't go to a horror film. This, this kind of approach is much easier to moderate. People get much less upset and it's much cheaper because there is not as much reactive content moderation to do.
You don't have to employ hundreds of people in call centers to review every report of something that's been flagged. And so this has kind of become the new standard. I remember one of the big questions, probably the, in the first piece we did was this question of like, what kind of a space do we consider Facebook? Because the First Amendment treats private spaces differently than public spaces. So it matters whether or not Facebook is more like a mall or a public square. And so given all these changes you just mentioned, like what is the metaphor now? I have one based on what you said, but I'm curious what yours is. No, I mean, I've always liked the mall metaphor and it has a weird, squirrely little place in First Amendment law in a bunch of cases. But I want to hear what you kind of want to hear what yours is. Well, to me, it's now or certainly the direction things seem headed based on what you've said is that it's now just, it's just broadcast again. Yeah. And with broadcast, there is no free speech, right? No. Like ABC, NBC, they can cancel a show at any time. They get to decide exactly what the evening lineup is. But with this, with social media, it's like, it's like a broadcast camouflaged as an organically generated thing. 100 percent, you know, you can shadow ban or take down or limit the reach, but it doesn't even have to be that subtle. Like Elon Musk always showing up in my feed, even though I don't follow Elon Musk, is like having Rupert Murdoch in like the interstitial spaces before every commercial break at Fox News, you know, like directly telling me what I should think, that isn't subtle. Like that is the other thing about this that is maybe the scariest part of the last couple of months is that none of it even is super pretextual. Like there's not a lot of like excuses. We're not even hiding behind algorithms anymore. It is just the owner of the platform saying the thing out loud and forcing everyone to see it if they're on his platform. You know, I think that if you're going to all of these different platform islands, the other thing is like, how do we change those? To use regulatory regimes to try to control how they speak is obviously a problematic thing by any type of measure. We don't want governments controlling speech for the exact reason of all of the authoritarianism we've just discussed. And so I think that there's it's very hard. Sorry, if I can jump in there, but it does feel like, yeah, I'm not for and have never been for the federal government coming in and molding Facebook's content moderation policies. Of course not. But if something no longer resembles a public square at all and instead has become, to keep reusing my label, like a camouflaged broadcasting network where it's like, yeah, these are individuals saying something that they believe in. But then that is being aggregated, amassed and pushed out as an opinion-changing product by someone on high. I am okay at that point with there being some sort of regulation. It's not regulating maybe what people are allowed to post, but maybe how it's being aggregated. I don't know. There has to be some clever somebody smarter than me who could come up with these sorts of rules. No, I mean, like every Western state has some type of media regulator specifically to avoid maybe like two or three people controlling all of media. Right. But all of a sudden we're like on the internet and yes, there's an infinite amount of content on the internet. But is it so infinite?
Like if there are if we're talking about like the same three main places that people are going to for their news, people are going to for like their for their daily interactions, people are going to to feel like they're part of a conversation, their water cooler, their public square, whatever it is. If that is like three people and they're all friends of the president, like that's that's a problem. And maybe even more importantly, journalists, they go to X. They go to Bluesky. They go to YouTube. They go to TikTok and they report things that are happening in those places as if they're real places that things are happening. But they're also controlled by these individuals and so they're not reflective necessarily of real world, yet they are being reported on as if they were reflective of real world. Right. And so I just think that what you've seen the last five years is an industry understanding the power that it holds in content moderation, that it's so not a customer service issue, that it is actually like a huge, huge force for for shaping public opinion. And that that has exponential value to political parties and governments. It's like as valuable as oil and guns, because how you push things, what you keep up, what you take down. I mean, this is how you can basically create, you know, the rise and fall of presidencies if you want to or political parties. And they know how to market them to you, no matter how niche you are. And that's scalable. And so like it's a way to make a lot of money. And then it's a way to control a lot of minds. You know, I think one of the reasons you and I have gotten along so well over the years and have worked so well together in this now trilogy of stories is that we both had sort of an unorthodox approach to this. Um, I mean, most people were saying that these Facebook guys were, were idiots, that they're bad, that they're causing lots of trouble, that we should just like cast scorn upon them. Yeah. And you and then me sort of following your lead were more like, what if we actually try to understand this problem? And I guess now with hindsight, I'm wondering like, did we miss something here? Were we sort of played the fool? Um, you know, it wouldn't be the first time that someone has told me that in some way I'm a useful idiot to Facebook or in some type of capacity. I didn't say, I would say we would be useful idiots. So I didn't call you. I'm asking if we are, is the question. I feel as if a lot of people and a lot of what we've said today, people will be like, of course this is what happened. This is what we were saying would happen, but it wasn't a fait accompli when we talked about it. It wasn't. Every single one of these solutions has the same flaw at the end of the day to it, which is that these are for-profit companies that do what they want to do and things change, um, as things settle. So I don't know. Okay. Well, so then, like, is content moderation sort of dead? I just, um, yeah, this is like a, this is like a very controversial thing. Um, it really depends on what you mean by that question. There has been a lot of controversy around, like, are they going to invest in these huge cost centers of trust and safety? Are they going to care about this type of issue?
Um, if they can TikTok-ify everything and just send you down these rabbit holes of endlessly drooly, like eye-glazed-over, like WALL-E kind of scene where you're on the couch with your Slurpee, like Barcalounger or whatever, like watching things, is that what they're, they're basically going to do and are they going to have to keep moderating? And I mean, I think that like the answer is that we're going to increasingly see an automated content moderation system. It's going to increasingly not embody the edges of society and the range of voice that we had at the beginnings of the internet and that we are going to kind of see a productification of speech. Yeah. I'd love to give you one, one, one more idea that I've been playing around with for a couple of years. Yeah. If I was ever going to write a short sci-fi story, it would be about the quote unquote, perfect piece of art. You step in front of it, it does a quick facial scan of you, pulls everything about you that it knows from the internet, and then it puts forward an image perfectly generated for you that will evoke a feeling. On Tuesdays, it's happiness. On Wednesdays, it's sadness. And so it's this visual tableau personalized to every person that evokes the same emotion. And once you have that, once you can control the emotions of people, with the flip of a dial by putting something in front of them that are going to only pique that feeling for them, then you could just control everybody. Well, I love that. Sounds like a Ted Chiang story. That is like, but that's, you know, you should write that. Maybe you can ask AI to do it for you if you're really busy. This story was reported and produced by me, Simon Adler, with some original music and sound design by me, mixing done by Jeremy Bloom. Of course, huge, huge thank you to Kate Klonick as always. And yeah, we will be back next week saying some more things. Until then, thanks for listening. I think we're using this one. Hello. Oh, I can hear myself. Kid, podcast, crossover special. Hi, I'm from. Wait, hi, I'm Noor Sultan and I'm from New York. And here are the staff credits. Radiolab was created by Jad Abumrad and is edited by Soren Wheeler. Dylan Keefe is our director of sound design. Lulu Miller and Latif Nasser are our co-hosts. Our staff includes Simon Adler, Jeremy Bloom, W. Harry Fortuna and David Gebel. Oh, so I just have to read that one name. OK. Oh, my God. Sindhu Gnanasambandan. Yes. Annie McEwen, Alex Neason and Sarah Qari. Oh, Sarah Sandbach, Anisa Vietze, Arianne Wack, Pat Walters, Molly Webster and Jessica Yung. Yeah, yeah, I see it. Do I sound like happy? With help from Rebecca Rand, our fact checkers are Diane Kelly, Emily Krieger, Anna Pujol-Mazzini and Natalie Middleton. Guys, I know I'm famous. You don't got to clap. It's all. Hi, this is Laura calling from Cleveland, Ohio. Leadership support for Radiolab's science programming is provided by the Simons Foundation and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation. WNYC's journalism and storytelling is heard by millions of passionate listeners. Sponsors of our programming gain our listeners' attention and their respect. Learn about how your organization can support WNYC and WNYC Studios at sponsorship.wnyc.org.