Taylor Lorenz’s Power User

Congress Is Pushing Another Internet Censorship Law: The SCREEN Act

31 min
Jan 9, 2026
Summary

The episode examines the SCREEN Act, a federal bill framed as child safety legislation that would mandate age and identity verification for accessing online content. Host Taylor Lorenz and guest Mike Stable from the Free Speech Coalition discuss how the bill functions as a Trojan horse for mass surveillance and censorship, with particular concern about its bipartisan support and implications for marginalized communities.

Insights
  • Age verification laws are designed as surveillance infrastructure that will inevitably expand beyond their stated purpose, following the historical pattern of security legislation like the Patriot Act
  • Compliance costs for age verification (roughly $0.50 to $1 per user) disproportionately harm independent creators and small websites, while large tech platforms can absorb the expense, creating market consolidation
  • The vague definition of 'material harmful to minors' is being weaponized to censor LGBTQ content, reproductive health information, and political speech across multiple sectors including libraries and ISPs
  • Democratic support for these bills stems from political cowardice rather than genuine policy disagreement, enabling right-wing censorship infrastructure while appearing to protect children
  • VPN bans and domain seizures represent the logical next step in this regulatory escalation, demonstrating how incremental surveillance laws create precedent for more authoritarian measures
Trends
  • Coordinated federal and state-level legislation using child safety framing to implement mass surveillance and content filtering infrastructure
  • Religious right and Heritage Foundation organizations driving internet censorship policy across multiple jurisdictions with bipartisan political support
  • Age verification industry emergence as a profit-driven sector actively lobbying for expansive legislation to increase market demand
  • Differential enforcement of content restrictions favoring right-wing platforms (X, Truth Social, Rumble) while targeting progressive and marginalized community platforms
  • Expansion of 'harmful to minors' designation beyond adult content to healthcare information, LGBTQ resources, and political speech in libraries and ISP regulations
  • International regulatory precedent (UK Online Safety Act, Australian social media ban) being cited to justify increasingly restrictive US legislation
  • Data retention requirements creating permanent surveillance records despite claims of expedited deletion, enabling government and private sector exploitation
  • VPN restrictions and ISP-level content filtering emerging as the next phase of internet control infrastructure
  • Liability frameworks designed to incentivize corporate self-censorship through vague legal standards and threat of litigation
  • Biometric data collection normalization through facial scanning requirements for routine internet access
Topics
Companies
Meta (Facebook)
Discussed as major platform subject to age verification requirements under SCREEN Act and similar legislation
X (formerly Twitter)
Noted as right-wing platform that has not implemented age verification despite similar laws, contrasting with Blue Sky
Blue Sky
Left-leaning platform that pulled out of Mississippi and implemented age verification in multiple states due to liability concerns
Reddit
Social media platform that would be subject to age verification requirements under Senate version of SCREEN Act
Netflix
Streaming service mentioned as platform containing material potentially classified as harmful to minors under SCREEN Act
YouTube
Video platform discussed as subject to age verification requirements under proposed legislation
The New York Times
News outlet used as example of mainstream platform that would require age verification under proposed laws
Truth Social
Right-wing platform noted as exempt from content restrictions that apply to mainstream social media platforms
Rumble
Alternative video platform noted as exempt from age verification and content restriction requirements
Federal Trade Commission (FTC)
Government agency tasked with enforcing SCREEN Act compliance and pursuing deceptive practice violations
People
Mike Stable
Director of Public Policy at Free Speech Coalition, primary guest discussing SCREEN Act implications and state-level legislation
Senator Mike Lee
Utah senator who introduced the SCREEN Act in the Senate; associated with Heritage Foundation censorship agenda
Representative Mary Miller
House representative who introduced the House version of the SCREEN Act
Neera Tanden
Biden administration official who advocated for eliminating internet anonymity at content creator summit
Eric Goldman
Legal expert previously featured on the podcast discussing age verification data retention requirements
Quotes
"You're paying sometimes 50 cents a dollar for every person who comes to your site. It's devastating."
Mike Stable, opening segment
"If every time you went to a site, if every time you went to the New York Times, you had to scan your face and upload an ID and send that in, you'd stop going."
Mike Stable, mid-episode
"These are just mass privacy and surveillance bills."
Taylor Lorenz, mid-episode
"Authoritarianism never happens all at once, right? Like saying that you want to ban VPNs would have been unthinkable five years ago."
Taylor Lorenz, late episode
"The more right-wing power and control on the internet is cemented in place."
Taylor Lorenz, closing segment
Full Transcript
You're paying sometimes 50 cents a dollar for every person who comes to your site. It's devastating. Every week, it feels like there's a new atrocious internet law that Congress has cooked up. And this week, we're diving deep into the Screen Act. Framed as a child safety bill, this law is an insidious Trojan horse for mass censorship and surveillance. It's been getting a lot of traction thanks to its backing from extreme far-right groups and the religious right. And now Democrats are signing on board, claiming that it'll help protect kids online. In reality, this law could end up forcing you to scan your face to use websites, ban all VPNs, and a lot more. Mike Stable is director of public policy at the Free Speech Coalition, an organization dedicated to protecting the rights and freedoms of workers in the adult industry. Mike has been fighting these censorship and surveillance laws on a state level and has been in the room at a lot of these state legislatures where these debates are taking place. Today, we're going to get really nitty gritty into what the Screen Act would actually do, what the bill says, how this law came to fruition, what sets it apart from some of these other really bad child safety laws, and why we need to band together to kill it. Mike, welcome to Free Speech Friday. Thanks, Taylor. I'm glad to be here. All right. So to start off, can you tell me where did this Screen Act originate from? Where did this piece of legislation like come out of? So Screen Act came from Senator Mike Lee, who is a senator from Utah. You might have noticed a lot of bills coming out of Utah around censorship, not just about adult content, but of LGBTQ content, social media content. There's a lot of stuff happening at the state level and at the federal level, obviously, with Senator Lee. So that's a sort of short answer as to where it came from. 
I think in a broader sense, it comes from the Heritage Foundation and groups like it that have looked for ways to repress things on the internet. Despite coming from these more right-wing places, Utah, as I say all the time on this channel, is always at the scene of the crime. We hate Utah's tech laws. They're so bad. But, you know, as you mentioned, I think a lot of these censorship efforts are also part of Project 2025. And yet, you know, the Screen Act is also being championed by Democrats in Congress. So can you explain a little bit about what does the Screen Act do? How does it work? So the Screen Act requires age verification for anyone going to what we'll loosely define as an adult site. There are different versions of this bill. In the House, it's more specifically adult sites or sites with material harmful to minors, which is a sort of broad category, at least as envisioned by conservatives. On the Senate side, it's any site that has material harmful to minors. Right. So that would include things like X, that would include Blue Sky, that would include Reddit, that would include Facebook, that would include Netflix. Right. Anything that is inappropriate for a minor to see. What it requires is that if you go to those sites, those sites need to perform some sort of age verification, right? And that may be scanning your face. That may be uploading an ID. In practice, that means both of them, from what we've seen from these bills at the state level. And it also has restrictions on the use of VPNs. So it's a general bill, I think, that we see, as you mentioned, right, these are Republican bills. They generally have come out of conservative states and from conservative organizations. We have seen some, you know, at the state level, at least, Democrats get on board with some of these because they're very difficult to vote against politically.
I think we've got probably 22, 23 bills that have been passed at the state level that are similar to the Screen Act. And I've been to a lot of these states. I've met with a lot of these legislators and often they are totally opposed, both on the Republican side and the Democratic side. I've had lots of conversations with people who are opposed to these bills for various reasons, but find them hard to vote against because they make for an easy attack ad. If you're saying, well, I don't think that I should protect children from adult content, I think it's a hard vote to make. And I think they're designed that way. I mean, so much of it reminds me of a very millennial thing to say, but of the 2000s with like terrorism laws and things like the Patriot Act, where it was like, oh, what do you want to do? Not protect America? Oh, you're not a patriot? Oh, you think the terrorists should be able to do X, Y, Z? And it's like, no, we just don't want these invasive surveillance laws that are ultimately going to harm innocent Americans. It seems like there's so many different forms of age verification. I hate the term age verification to begin with. I feel like we should just say identity verification because that's ultimately what these things do. But can you kind of explain why this is such a problem? Why is it bad if, you know, we roll out mass age verification on social media? Hopefully anybody listening to this podcast knows, but if you can just give a little bit of information about the harm that the Screen Act could cause if it was enacted. Sure. So I think that what we see when these bills are presented and you go through the testimony of the advocates is that these bills are really easy to comply with. Right. And it presents no burden for adults. Right. That generally has been the standard for constitutionality. You know, is there a burden for adults? Is there a significant burden? Does this function as censorship, right?
If I go to a news site or YouTube or, you know, an adult site, is it going to stop adults from doing this because it's too burdensome, right? Because somebody has to scan their face, and in practice that is true. But what they will tell you and what they will testify is that there's no burden whatsoever. This presents no problems. I was reading testimony from one of the people at the FTC, who's sort of been an advocate for these bills, from a couple of years ago. And he said, you know, you wouldn't even notice, the technology is so good. You wouldn't even notice that it was happening, right? That's even more scary. Yeah, it is. I mean, the technological proficiency of some of the advocates of these bills and some of the legislators is quite frightening. But, you know, in practice, what happens is you go to an adult site, or someone goes to an adult site, they are asked to scan their face and to take a picture of their ID and send that ID to a third party. I will tell you, to no surprise, nobody wants to do that. They don't want to do it to access Blue Sky. They don't want to do it to access the New York Times. They don't want to do it to access YouTube, but they especially don't want to do it when it comes to accessing an adult site, right? And so there is a huge chilling effect, right? So what we see is around 95% of people, when they're confronted with this, they hit the back button and they go to a site that is not located in the US and is not obeying US law or sometimes even international law, or they just get a VPN and go around it. I think that people are very wary of submitting that information because, one, they worry how it's going to be used. Is this information going to be kept secure? Is it going to be intercepted? Is it going to be used against them? Are they going to be victims of identity theft? Are they going to be victims of extortion, right?
From somebody saying, hey, listen, I know you went to this website, send me X amount in Bitcoin or I'm going to divulge it. And also I think that, and this is sort of the slept-on concern, it's a pain in the neck. If every time you went to a site, if every time you went to the New York Times, you had to scan your face and upload an ID and send that in, you'd stop going. You'd find some other place that had it because you're not going to do it. And people are accessing these sites on their phones. Right. So they're not in a position where, you know, and sometimes one-handedly, to be frank. Right. You're not in a position where you're able to take, you know, a selfie, and people don't want to do it. Not everyone has a driver's license, by the way. I mean, not everyone even has their documentation. I mean, it's just an absurd premise that shuts out a lot of vulnerable people from the Internet, too, and access to resources and information, much of which comes from social media platforms because the entire information ecosystem has been reoriented around social media. And now we're going to make that information completely inaccessible to anybody that doesn't have their like government ID ready to go. And like you said, able to scan their face indefinitely, repeatedly, every time they want to, you know, access one of these platforms. Or they don't have a webcam. I mean, at Free Speech Coalition, we get people writing to us confused. They think that we may be in charge of the sites, because sometimes the sites will reroute their traffic to us to say, if you want to understand what's happening here, visit Free Speech Coalition. So we get people who will email us their ID, a picture of their photo ID. Or we'll get people who will say, hey, listen, I don't have a webcam on my laptop, right? I can't do this. I've spoken with one guy in Tennessee who said, listen, I have a visual impairment. It takes me so long and I don't have steady Internet.
So it is difficult for me to do this before I'm logged out. In Tennessee, you have to do it every hour. You have to verify, re-verify every hour. Every hour? Like, it's crazy. And I mean, so many states, as you mentioned, and I'm curious kind of what it's been like, are rolling out these types of restrictive laws. I mean, I saw also Virginia is trying to, like, block kids from accessing social media during certain times. They're only going to be allowed to use it. Like, again, this is the government coming in and telling them, like, when they can access certain websites. So dystopian. How have these laws been able to get through? I mean, aside from the fact that obviously, yeah, no politician wants to have a backbone and stand up and say, yeah, actually, I'm against the quote unquote child safety laws. But what other arguments are they making that are compelling, or, like, what interest groups are out there pushing this stuff forward? Because it seems like suddenly, nearly overnight, it's gotten so much traction. I will be frank. Most of the people who are pushing these bills are faith-based groups, right, as you would imagine. And I think that tracks certainly with Utah, but in the states that we have seen this come in, these sort of started in places like Louisiana and Utah and Texas, where the bill was written in conjunction with the sponsors, maybe a church pastor, right? These are bills that are really seen as backdoor censorship, right? They understand that there are limits on the First Amendment, that there are protections for adult content in the First Amendment that are difficult to get around. And so what they've said is, if we're going to lose on First Amendment grounds, what we need to do is create as many restrictions as possible so that people don't access this site. And they're not talking about just kids, right? They are talking about adults.
And we hear this over and over again from legislators when it passes and an adult site maybe pulls out of a state or blocks it. Blue Sky was, I think, pulled out of Mississippi for a time actually over some of these laws. Yeah, because the liability is so vague. Anybody can bring you into court in a lot of these states, right? So a parent who says, hey, listen, my kid just told me they're trans, and I went on their browser history, and I saw that they were visiting this site that had trans information on it. I'm going to bring you into court, right? And as soon as somebody sues you, you lose, right? It's going to be hundreds of thousands of dollars. And so a lot of people just sort of shut down instead, right? Because especially at the corporate level, they say, well, we're going to comply in advance, right? Like, yes, this is wrong, but corporate counsel says we don't want a lawsuit, or we don't want multiple lawsuits, or we don't want the attention of the federal government. So what we're going to do is we're either going to pull out of a state or we're going to remove content that might be an issue. I talked to a healthcare site a few weeks ago, and they have shut down because they are afraid of their liability under these state laws, right? Because a parent might say, hey, I don't think that this information on reproductive rights, or I don't think this information on consent is appropriate for a minor. I think it's harmful for a minor. I think it sexualizes minors. I'm going to take suit against you. They didn't face any of these suits, but their board basically said, you know what, I'm not comfortable with this. You know, there's a liability here. And I think that that's the censorship effect we see. So, you know, one, it's a censorship because people don't want to do it. And so they don't go to those sites. 
The other is that sites that might have material like this, like Blue Sky, like Reddit, start taking it down and start censoring it because their lawyers say it's just easier to do this. Yeah, these are vulnerable communities. Yes, this is valid means of expression, but I don't want a lawsuit or I don't want the FTC coming after me in the case of the Screen Act. Yeah. And as we've covered before as well, it's like these tech companies, their job is not to protect free expression for trans people. Their job is to make as much money as possible. And they can only do that in a friendly regulatory environment. And so I think they're, yeah, they mass comply. I think what's so scary is, as you mentioned, all of these state laws are happening. A lot of these platforms and websites operate on a federal level. Like, I mean, there's only one Facebook app, right, that you can download in America. And so they will just go ahead and default to the most restrictive state laws sometimes in terms of the content that they'll recommend to people or show or allow to be discovered through search. Yeah, I mean, I think there's also sort of a marked difference in terms of consumer behavior around these sites, because like you said, there is only one Facebook, right? So people may go through those efforts to, hey, I'm going to verify and I trust Facebook and they've got my information because I've got no other options, as sad as that is. With adult sites, there are millions of sites globally, right, with this type of content. And so if they go to a specific site and that site says, well, we're trying to comply with Kansas law or Mississippi law or whatever it is, we need you to scan your face and upload your ID. They hit the back button and they find some place that isn't, that is based out of Cyprus or Romania or Russia. Right. That's even more unregulated and full of bad, the bad stuff that they claim to want to protect kids against. 
One thing I think about is I wrote this piece for the Guardian after the Online Safety Act went into effect in the UK about how many addiction forums were taken down and actually forums and websites where young people could learn about SA or things happening to them, you know, adults taking advantage of them, learning what those inappropriate power dynamics were. And basically they were less able to report abuse because these sites and platforms and communities were being taken away from them. And so I think it's actually, yeah, I mean, it's causing immense harm to children in those ways. I'm curious, like with The Screen Act specifically, it feels like there's so many of these different laws. There's so many of these different legal efforts. Can you talk about the status of the Screen Act in Congress and who's involved with it and what makes it a little bit different from something just like the Kids Online Safety Act or some of these dozen of other kind of similar laws? So the Screen Act is introduced in the Senate by Mike Lee. In the House, it was introduced by Representative Mary Miller. It hasn't progressed very much in the Senate. It hasn't moved at all. It was the subject of a subcommittee hearing, along with, I think, 19 other bills just before the holiday, where they sort of unanimously pass them, pass it out of the subcommittee. We don't know yet sort of what the feeling is on the ground with a national bill. I think there are concerns it preempts a lot of the state laws. There are differences between the Senate version, which applies to any site with any amount of material harmful to minors, and the House bill, which is more specifically targeted at, quote unquote, adult sites. So we don't know what that sort of reconciliation looks like, much like we look at with COSA. I think the difference for the Screen Act is that COSA says that you don't have to age verify, right? That's their claim. 
They don't explain how they're supposed to do it, but the Screen Act does. The Screen Act says you have to sort of collect this information. You need to delete it expeditiously, but you have to do it. And if not, the FTC is going to come after you for deceptive practices. It doesn't say you need to upload an ID, but it says you need to come up with a system for verifying identity to make sure that minors are not accessing any adult content on your site. And in practice, that's going to be obviously ID and face scanning. Yeah. When I think of the Screen Act, I guess, or when I've tried to explain it to people, I think that it's a little bit more prescriptive in terms of how it's trying to mandate age and identity verification. And it's saying like a little bit more about the specific ways you have to go about doing that, whereas COSA is a little bit more broad. And I think what's so insidious about the Screen Act is it's also more centered towards quote-unquote adult sites. And COSA is just more broad and seems to be more focused on social media. I think, however, when we see how these laws play out, why I think it's so important to defend adult sites and defend access to them and fight against all forms of this stuff is because, I mean, you mentioned the Senate version of the Screen Act right now. It claims that sort of any site that hosts content that could be quote unquote harmful to minors has to enact these restrictions. And content that is harmful for minors is a deeply subjective delineation. And there's no like sort of common agreed-upon definition of what that is.
And as we've seen, at least the way that these laws have played out in other countries, content that's deemed harmful for minors includes things like, you know, information about Israeli war crimes or things like that, aside from just LGBTQ stuff and reproductive justice stuff, I think a lot of other websites end up being censored under these like adult or inappropriate-for-minors sort of content bans. Yeah. And in the US, it's definitely more specific. I think that the same people that are pushing these laws, the same groups that have been pushing these laws on the online side are at the library level using harmful to minors to pull LGBTQ books out. They're using harmful to minors to pull books about SA out. They're using harmful to minors as a designation to pull resources out, right? Healthcare resources, anything that might teach kids about their bodies or puberty or anything like that, that is deemed as adult content by these parents or by these legislators. And I think that you have to understand, you have to look at how the right hand is making all of this noise about online safety. The left hand with these legislators is pushing forward all of these restrictions at the library level where, you know, the right hand says, well, it's just about adult content. At the same time, the same groups are saying we're designating all of this other content as adult. And it's a bit of a Trojan horse, to be honest. I think another sort of Trojan horse argument that I've seen among the left is that this is cracking down on big tech or that this is somehow sort of regulating tech companies. And that infuriates me.
I was fighting with somebody on Blue Sky earlier this week, literally, where they were backing these bills and saying, well, you know, we have to crack down on big tech and you're a big tech shill if you say that, you know, we shouldn't have this specific regulation, which is very silly, I think, to say of certainly me, I think either of us, where we constantly report critically on big tech. But not only are a lot of these websites that are targeted actually independent, smaller websites, like this is going to have devastating consequences for smaller websites. But can you kind of talk about that argument? Because I hear it so much from the left. I think so many leftists have literally gotten on board with this like Heritage Foundation agenda, because they're falling for this propaganda that it's somehow, you know, cracking down on tech. I think that specifically with Section 230, we hear that all the time. And what people don't always realize about these laws is they're really difficult to comply with. So a large site, whether it's an adult site or a site like Facebook, has some resources to institute these protocols. It may not be what they want to do, but they have the lawyers to design them. They've got the resources to pay for them. They can find ways to make it work. When you're talking about an independent site, you know, if you're talking about an adult site, a site that is run maybe by a performer, or some sort of mom-and-pop site, or something that is dealing with LGBTQ issues and art, that is a hobby. They don't have the money to pay, right? What people don't realize about age verification is that it's expensive. You're paying sometimes 50 cents, a dollar, for every person who comes to your site. That's, it's devastating. And then to house all the data, and then to house the data in an ethical way, right? Where it's not just going to be hacked and scanned. Yeah.
And to delete it and make sure there's all of these, these laws are nothing but tripwires and they're tripwires for liability. And when I look at the Screen Act and you go through and try to figure out what's different here than state laws or what should adult sites be worried about or what should larger, you know, other sites that are not maybe having adult content, but might have content that a conservative FTC might determine is harmful to minors or trans healthcare, that sort of stuff. What do they need to do? And you're really left with not a lot of good solutions, right? If you do comply and you perform the age verification, there are so many restrictions around what happens with that data and how do you have to delete it? And how do you prove that it's deleted? And in some states, there are things where you have to delete the data, but you also have to prove that you complied with the law, right? And that you performed it on this person. So you can't delete the data. You can't delete the data. You can't delete the data because you have to prove if you're sued that you did actually verify that person's age. That drives me crazy. And I had Eric Goldman, who's iconic, who's on here to talk about age verification. And he talked about that so much of like this idea that, oh, well, the data is just seen and deleted is just simply never true because you have to retain that data in case you get sued or in case the government comes and says, hey, you didn't verify these people. You have to show proof that you did. And that means retaining all of that data. And it's a new industry, right? There's not a lot that people know about age verification. These companies have sprung up like mushrooms after a rainstorm in the wake of these laws. In a lot of cases, they pushed forward these laws. And we see these groups testifying. These are for-profit businesses in a for-profit industry that is looking to apply this to as much content online as possible, right? 
They partner with the religious right groups. They partner with the people who want to restrict the internet because it's in their interest. And I don't think there's a lot of attention that's given to them. But we don't know a lot about them. There aren't a lot of audits. There aren't a lot of programs to figure out what's actually happening with that data. When I testify at state legislatures, there are people from the age verification community who will come up and testify, or I'll hear legislators repeat it. Oh, it just disappears. Or, oh, you can do it with your hand. Or, oh, you know, there's all of these different solutions and nobody has to worry about privacy. And it's just not true. I mean, in addition to the fact that we don't have any proof of any of this and that there's always going to be new people who pop up that don't follow protocols and keep that data. There's also obviously always the risk of surveillance. Even if you're sending it and that data is being deleted, there's the opportunity for someone, if you're not using a VPN or even if you are, to come in and take that data. It's very sensitive. You shouldn't be sending your biometrics over the Internet. Especially when it's tied to your identity. I think what's so insidious, too, is that we see how the government is seeking to leverage people's browser history, Internet search, you know, information, websites they visited, all of that against them by the Department of Homeland Security and other government agencies. We've seen the UK accusing people of terrorism for the content that they've consumed or shared or engaged with somehow online. And I think that that's terrifying. We should all value our privacy. And at the end of the day, these are just mass privacy and surveillance bills. And, you know, you mentioned VPNs, and I think that's how a lot of people have been getting around these laws temporarily. But it seems like there's increasingly a quest to ban VPNs.
I mean, people are talking about VPN bans now. I can't remember what state it was the other day that just said that they want to ban VPNs. And I think you're even hearing, remember, you know, people in the EU talk about this more. It seems like it's more and more on the table. And I think that that's the ultimate goal, honestly. I've heard advocates for these bills, including people who are now at the FTC, say early on, we know these laws don't work. We know that they're not going to be effective. And that's when there's going to be other options that we need to come in and consider. And so I think that what they've done is they've laid a trap. They've said, oh, here are these laws that will solve this problem. Well, this problem's not solved. So now we have to do this other thing. We've gotten you buy-in that you require the surveillance online. But now we're going to need an additional point of surveillance that maybe you wouldn't have agreed to before. So we see in Michigan, they asked for a full ban on VPNs. In Wisconsin, there is a bill pending that effectively bans people from using VPNs to go to adult sites. It bans adult sites from accepting VPN traffic. And I think that even in the Screen Act, it says that anybody who comes through a VPN has to be age verified. So I think that you're looking at that sort of legislation. I think you're also looking at a move beyond that in some cases where they proposed, listen, these laws aren't going to work. And then we're going to have to start seizing domains. Right. We're going to use the DOJ to go after it. So I think that authoritarianism sort of comes step by step. 
And I think that when you're going after adult content, when you're going after material harmful to minors and saying we need to protect kids online, it's easy to get people involved. And then once they've done that, you say, well, you're already in agreement with this. We just have to do this other little thing. We just need to go a little bit further. And slowly those rights erode. And you talked about terrorism, and what happened in the advent of the Patriot Act, and the ways in which those powers that we were worried about then are now being exploited in every way you could imagine. Yeah, this is the same thing. We talk about the ways in which all the public cameras, the license plate readers, all the things that were supposedly just about catching traffic violations, are now being used to catch immigrants, right? With these bills, they're asking that you have a camera inside your home, that anybody who goes on the internet has to undergo surveillance in order to access it. It's not going to stop with adult sites. It's not going to stop with material harmful to minors. They want to make it so that in order for you to access the internet, you have to give up anonymity.

Yes, which is what Neera Tanden, the Biden administration official, was advocating for on stage last year at their content creator summit. And the fact that the Biden administration and these Democrats continue to push so hard for it is terrifying. In terms of this slippery slope, I think of what happened in Australia recently, where they, quote unquote, banned social media for kids under the age of 16. By the way, there's no real difference between age 13 and age 16 accessing this content; these are all just made-up lines. And we could go down another rabbit hole about the fact that there is no documented harm here, by the way, from any of this content. All of that is pseudoscience bullshit. So they enacted this law.
And now, just this week, you're seeing people on Australia's biggest morning news show saying, you know, we need to go further and ban cell phones. A lot of the surveillance needs to happen at the cell phone level, and we need to prevent anyone under the age of 16 or 18 from even accessing a cell phone or a computer. It just goes further and further and further, because, like you said, if you listen to the Project 2025 and Heritage Foundation people who are out there saying it, their goal is authoritarianism and mass censorship. And these liberals, especially the liberals and leftists, drive me the craziest, because I'm like, you're supposed to be against authoritarianism. They say, oh, well, yeah, we should protect minors online, quote unquote, right? We do want to crack down on big tech. So yeah, we do have to go along with this stuff. And as you said, it's a slippery slope. Authoritarianism never happens all at once, right? Saying that you want to ban VPNs would have been unthinkable five years ago, and now it's suddenly on the table in multiple regions. That's terrifying in itself. And I think it also shows the United States government's desire to control access to the internet in America, like the way they're trying to make a separate TikTok America and make it harder even for us Americans to get information or perspectives from the outside world.

And to position these as binary issues. They frame it in a binary way where you can't disagree with them. And with this, it's: oh, you think children should be harmed. You think children shouldn't be protected. And once they get you to agree to one of those propositions, it's easier for them to say, well, this is along the same lines. And I think that authoritarianism doesn't happen all at once, although this year was pretty quick. But a lot of the groundwork had been previously laid.
And I think that what I try to communicate to legislators on the left is that we have an administration that is open about wanting to exploit absolutely every loophole it has, right? If there is vagary in a law, they're going to take advantage of it. And to say, well, this is just going to apply to adult sites, or this is just going to apply to really harmful content? There's no reason to believe that. The people behind these laws have defined transgender ideology as harmful to minors, right? They have been explicit that they understand all of these things to be the same thing, that they're going to go after them the same way, and that they think ISPs should be shut down if they let this material through. I mean, I was looking at a law in South Carolina that was just introduced that requires ISPs to block material harmful to minors.

Which includes healthcare information, by the way, basic information about human rights. It's scary. And at the same time, by the way, they're putting PragerU videos in schools in Oklahoma, or wherever, you know.

All of this is about liability. As we see at the level of this administration, they're going to go after people who disagree with them. And so when you look at ISPs, if you think about South Carolina with an ISP ban on material harmful to minors, they're going to let sites that are friendly to an administration through, and they're not going to let through sites that aren't. Look at Bluesky pulling out of Mississippi, or Bluesky activating age verification in Ohio or South Dakota or Wyoming, where there are lower levels of this. They've done it in a number of states. X hasn't done that. X is not afraid of getting sued. They're not afraid of a conservative attorney general, but Bluesky is. And so a site that is sort of left-leaning is going to be boxed out, you know, with limited access.
And a site that is right-leaning is going to have all the access it wants, despite the fact that, as you know, Grok is producing material harmful to minors.

Very much material harmful to minors. Yeah. What scares me so much is that the more of these restrictive laws pass, not just the SCREEN Act (and there are so many at this point, it's such a deluge), the more right-wing power and control over the internet gets cemented in place. As you mentioned, none of these laws regulating social media seem to apply to X. They don't apply to Truth Social. They don't apply to these far-right places. They don't apply to Rumble, right? But they apply to mainstream social media where you can get liberal opinions, and to progressive and leftist websites, and to any information about marginalized groups. And I think that's what's so scary. And I think that's also why it's so infuriating to see so many leftists get on board with this crusade.

I think that when you're talking about adult content, or when you're talking about technology, legislators don't have a lot of familiarity, or they feign a lack of familiarity, especially with adult content and adult websites. But it's easier for them just to go with the flow. It's easier for them to say: I've got an election coming up. This is a difficult issue. Everybody, quote unquote, agrees that we need to do something about this. I'm going to do something about this. And if a marginalized population gets hurt, well, they're not a big population anyway, right? They're not big voters anyhow. And I think that's incredibly dangerous. And that's what we're seeing across the board. Whether it's the SCREEN Act or KOSA, there are all of these bills put out there that are easy for people to go along with, because the stated downside is bad and going along is easy.
Not realizing that, you know, there's poison in there.

I think the lawmakers know there's poison. I think they don't care. Maybe some people are going along with it in ignorance. For my own mental health, I have to pretend that there's some sort of rational world out there, and that people are fighting for this in good faith. But I think the more cynical view is probably the more accurate one. Mike, thank you so much for joining me today and talking about all this stuff.

Taylor, I wish there were better news, and I wish we were talking about bills that would actually help people, but thanks for having me on.

That's it for this week's episode. Please support my work via Patreon at the link below. You can also buy a paid subscription to my tech and online culture newsletter at usermag.co. That's usermag.co. Or you can get my bi-weekly newsletter, a roundup of everything that I'm reading and seeing online, on Patreon. Again, the link is right below. On Patreon I also post bonus episodes of Power User, I host a monthly Q&A livestream, and I post lots of updates about my life and my work. As I've said before, talking about these topics makes it extremely, extremely hard to get any sort of advertising support, and I have no long-term brand partnerships. I am a hundred percent independent. I'm not funded by any weird outside political organizations or dark money groups or anything like that. The only reason I can continue to do the work that I'm doing is thanks to support from people like you. Literally every single dollar makes such a huge difference. So if you can, please support me on Patreon or Substack via the links below. Seriously, it makes such a difference. Thank you so much, and I'll be back next week with a brand new episode of Free Speech Friday. See you then.