Tech Won't Save Us

How Effective is Australia’s Social Media Age Limit? w/ Cam Wilson

59 min
Dec 18, 2025 · 4 months ago
Listen to Episode
Summary

Australia's under-16 social media age limit took effect in December 2025, but early implementation reveals significant circumvention and questions about effectiveness. The episode examines how the policy emerged from media campaigns rather than expert consensus, what regulatory opportunities were missed, and what lessons other jurisdictions should learn before following Australia's lead.

Insights
  • The age limit legislation originated from media-driven populist campaigns rather than evidence-based policy development, with News Corp and other outlets using child safety concerns to advance commercial interests amid stalled news bargaining code negotiations.
  • Implementation relies on a 'waterfall' of age verification methods (data inference, facial scanning, ID verification) that are easily circumvented, yet the government has quietly lowered success metrics from preventing harm to merely 'starting conversations' about social media.
  • The legislation removed a critical 'exemption framework' that would have incentivized platforms to create safer, curated versions of apps for young users—a missed opportunity to regulate platform features rather than impose a blanket ban.
  • Other concurrent regulatory efforts (Online Safety Act, children's privacy codes, algorithmic feed reforms) were overshadowed and potentially undermined by the age limit campaign, representing a significant opportunity cost in tech policy.
  • Digital sovereignty concerns regarding US data access and the Trump administration's potential retaliation have created political hesitation around further tech regulation in Australia, despite bipartisan defense alignment with the US.
Trends
  • Global jurisdictions adopting age-based social media bans following Australia's lead, but with variations (EU proposing 15+ with parental consent options)
  • Shift from evidence-based tech regulation toward populist policy-making driven by media campaigns and parental anxiety narratives
  • Age verification technology (facial recognition, biometric scanning) becoming standard enforcement mechanism despite privacy concerns and circumvention vulnerabilities
  • Regulatory focus narrowing to age restrictions while broader platform harms (algorithmic amplification, predatory features, data collection) remain unaddressed
  • Geopolitical tech regulation tensions emerging as US administration threatens retaliation against countries implementing independent tech policies
  • Chatbot regulation emerging as parallel concern, with age verification and content restrictions required in Australia starting March 2026
  • Digital sovereignty discussions gaining corporate attention despite political silence, particularly around US government data access and jurisdiction
  • Regulatory opportunity cost: simpler, more politically saleable bans displacing more complex but potentially more effective feature-based regulation
Topics
  • Social Media Age Limit Legislation (Australia)
  • Age Verification Technology and Biometric Scanning
  • Platform Circumvention Methods and Enforcement Gaps
  • News Media Bargaining Code and Commercial Interests
  • Algorithmic Amplification and Platform Features Regulation
  • Children's Online Privacy and Data Collection
  • Digital Sovereignty and US Data Access
  • Chatbot Regulation and Age Verification
  • Regulatory Opportunity Cost Analysis
  • Media Campaign Influence on Tech Policy
  • Trump Administration Tech Retaliation Threats
  • Content Moderation and Online Safety Frameworks
  • Parental Consent Models vs. Blanket Bans
  • VPN and Platform Circumvention Adoption
  • International Tech Regulation Harmonization
Companies
Meta
Major platform subject to age limit ban; withdrew from Australian news bargaining code negotiations; developing age v...
TikTok
Listed as primary social media platform covered by age limit; users can still access via logged-out state with algori...
YouTube
Controversial inclusion in age limit legislation; remains accessible in logged-out state, undermining ban effectiveness
Facebook
Subject to age limit ban; part of Meta's portfolio; historically failed to enforce age restrictions
Instagram
Meta-owned platform subject to age limit; uses AI classifiers for age inference from user content
Snapchat
Classified as social media (not messaging app) under Australian law; subject to age limit restrictions
Google
Subject to news bargaining code; paid hundreds of millions of dollars to Australian news outlets under deals struck via the code
Yoti
Third-party age verification provider contracted by platforms for facial recognition-based age estimation
k-ID
Third-party age verification provider used by platforms for biometric age estimation services
WhatsApp
Messaging app excluded from age limit despite social features like stories and broadcast messaging
Telegram
Messaging app excluded from age limit legislation despite social functionality
Roblox
Gaming platform excluded from age limit despite youth safety concerns and social features
Microsoft
Copilot chatbot used by Australian public servants; subject to US data access and jurisdiction concerns
OpenAI
Australian government signed contract with OpenAI; subject to chatbot age verification requirements
Palantir
Has contracts with Australian government; represents US tech dependency and data sovereignty concerns
News Corp
Launched 'Let Them Be Kids' campaign backing age limit; has commercial interest in news bargaining code payments
ABC
Australian public broadcaster; mentioned as potential alternative platform for youth engagement
People
Cam Wilson
Associate editor at Crikey; co-author of Conspiracy Nation; primary guest providing expert analysis on Australian soc...
Paris Marx
Host of Tech Won't Save Us podcast; conducts interview and provides context on international tech regulation trends
Anthony Albanese
Australian Prime Minister; committed to age limit in May 2024 radio interview; received criticism from teens on socia...
Jonathan Haidt
Author of 'The Anxious Generation'; book influenced Australian state premiers to pursue age limit legislation
Joe Biden
Former US President; promoted AUKUS defense deal with Australia; context for US-Australia tech relations
Quotes
"But yeah, there was that thing where we just missed this like incredible opportunity to do something less powerful to big tech. Let's actually force you to, if you want to like, you know, have youth users and perhaps even users who are older to work in these ways to our expectations. To me, that's actually harder on big tech than it is just to say, I'm going to ban."
Cam Wilson (Opening segment)
"I think that, like, having a reform that just says, no matter what, if you are a piece of social media, like if you're a social media application, you are banned, is like fundamentally kind of just condemning the whole internet and any potential as, like, broken. Whereas I think, like, the point is we should all be thinking about this, whether we're legislators or just, like, normal people who are, like, trying to set up our own communities: like, how can we use technology in a way that is not harmful?"
Cam Wilson (Mid-episode)
"I really do fundamentally think that big tech has really failed to take care of its users over the last two decades. I think, in particular, young people have been really, really failed. And I don't trust big tech to implement the best ways to take care of its users without regulation."
Cam Wilson (Closing segment)
"I would definitely start out with a lower kind of accuracy and confidence, rather than ramp it up and unnecessarily kick a lot of people off the platforms to then cause a fuss."
Cam Wilson (Implementation discussion)
"Whether you think a ban will work or whether you think a ban won't work, the way that Australia has implemented it has been really, really shoddy. In fact, actually, because I was, like, quite critical and covered, I think, flaws in the process, people just kind of assumed that I am inherently against the ban. But the case that I always make is, like, if you support a ban, you actually want it to be figured out and implemented in the most solid way."
Cam Wilson (Closing advice to other jurisdictions)
Full Transcript
But yeah, there was that thing where we just missed this like incredible opportunity to do something less powerful to big tech. Let's actually force you to, if you want to like, you know, have youth users and perhaps even users who are older, to work in these ways to our expectations. To me, that's actually harder on big tech than it is just to say, I'm going to ban. Hello and welcome to Tech Won't Save Us, made in partnership with The Nation magazine. I'm your host, Paris Marx, and this week my guest is Cam Wilson. Cam is an associate editor at Crikey and writes the Sizzle newsletter. He's also the co-author of Conspiracy Nation. As I'm sure you've seen in the news, Australia has this new age limit on social media, limiting anyone under the age of 16 from accessing social media platforms in various ways. And that has stirred up a lot of discussion, both, of course, within Australia, but around the world, as many other countries and governments have looked at pursuing a similar policy, in large part because of Australia's leadership, because it has moved first and, you know, kind of set the mold for what this might look like. And after a year of discussion, that age limit has finally taken effect. So I figured it would be a great opportunity to have Cam back on the show so we can discuss what this actually looks like, how things have developed over the past year, whether it looks like this age limit is actually going to be effective, and what we would really need to do to take on the harms that social media has caused, the concerns that we have with how these platforms work, and the impacts that they have on their users, and whether the age limit is the right way to go about it, or whether we should be looking at other regulatory or policy measures if we were serious about trying to rein these things in. 
And because I had Cam on the show, we also talked about some other issues in Australian tech policy, in particular, whether chatbots have affected this discussion in any kind of way, how the digital sovereignty discussions are playing out in Australia, and what the effects and the kind of bullying from the Trump administration are looking like down there and how governments and people are responding to it. So overall, I would say I think that this is, you know, a really fascinating and important conversation, especially as the discussion around these age limits on social media are not going to be going away. And I think it is about time that our governments did actually take actions to rein in the harms of these platforms and to expect them to operate in ways that align more with the values and expectations that we have. But then that leaves open the question as to whether the age limit is the best way to do that or whether we should be pursuing other regulations and measures to achieve those goals. And I think Cam gives us a really great insight on that front. And so if you do enjoy this episode, make sure to leave a five-star review on your podcast platform of choice. You can share the show on social media or with any other friends or colleagues who you think would learn from it. And before we get into this week's episode, I just wanted to let you know a couple of things. First of all, we'll be doing our end of year live stream for Patreon supporters on Friday, 6 p.m. Eastern time, 3 p.m. Pacific. That's 10 a.m. in Australia. It will be Saturday morning there. And, you know, it's quite late in the evening in Europe. But if any of you are up late, you know, 11 p.m. in the UK, midnight in Central European time, you're welcome to join us as well. And I would love if you did. So as I said, you have to be a Patreon supporter in order to join the live stream. but we'll also be releasing that discussion as a podcast episode the following week. 
So if you do still want to hear us chat, you'll be able to do that. On top of that, I think it's about time we do a mailbag episode or, you know, to try that. So if you do have questions for me about, you know, technology, tech policy, whatever, you know, it happens to be, we'll kind of pull some of those and put together an episode of questions, essentially. We'll see if you're interested in that. So if you do want to ask a question, if you're a Patreon supporter, you can send me a message through Patreon if you want to do that. If you're not a Patreon supporter, we'll put together a little email address where you can send those questions. And that will be mailbag at techwontsave.us. So mailbag, all one word, M-A-I-L-B-A-G at techwontsave.us. And I'll put that in the show notes if you want to find it as well. So send us some questions before the end of the year, and maybe one of them will be included in a mailbag episode. And so with that said, if you do enjoy the work that goes into making the show, if you enjoyed all the great conversations that we had throughout 2025 and are looking forward to what is coming in 2026, certainly consider becoming a paid supporter where you'll get ad-free episodes of the show. You'll get access to that live stream that I mentioned that will be happening. You know, if you listen to this on Thursday, it will be happening tomorrow. And you can also get stickers if you support at a certain level where you'll be joining supporters like Sagar from India, Paul in Lismore, Australia, and Antoine from Quebec City by going to patreon.com slash tech won't save us where you can do that. Thanks so much for your support in 2025. Hopefully you'll consider becoming a Patreon supporter. And either way, please enjoy this week's conversation. Cam, welcome back to Tech Won't Save Us. Hi, good to be back. Yeah, great to talk to you, you know, from the other side of the world. I'm not down your way at the moment. 
There are a lot of people paying attention to Australia right now because you guys are like the first movers, you know, the people who are going first on trying to limit youths basically from social media, the under 16 social media age limit that we've been hearing about, not just in Australia, but now other jurisdictions are looking at following suit and doing something similar. So I was wondering to start, can you just give us an introduction to, like, what is this social media age limit? How did it come about? What is this thing supposed to do? Australia last week introduced a legal expectation that social media platforms will take reasonable steps to stop Australians under the age of 16 from having accounts on their platform. It applies to all platforms that fit a very broad definition of social media platforms, but the government, to give some kind of certainty, said, we're giving you a list of 10 platforms that definitely fit this requirement. And that is all the major ones you can think of: Facebook, Instagram, TikTok, YouTube, which was a kind of controversial inclusion at points, Snapchat, a few others as well. And the law itself was passed a year ago. It was given a 12-month lead-in time to figure out the details as certain aspects of implementation were worked out. At the same time, the government also ran a government-commissioned trial to look into age check technologies. And that law was actually passed after a pretty quick process where earlier in 2024, there was a little bit of support from some state premiers, like heads of Australian states, for something like this. And then in a radio interview, our prime minister was asked, do you support a campaign, which was launched that month by like a group backed by a popular radio station that said, hey, we want to raise the minimum age to 16. Simultaneously, there was also another major media organization running a very similar campaign. 
That was when the prime minister said in May, yep, I back it, I support this. And then six months later, it was law. So we saw a pretty quick process in a way for it to actually be locked in. And the actual details of implementation, which kind of went back and forth, and there's lots of questions about how do you actually take this idea, which is a very kind of broad law, and actually implement it, took a little bit longer. And then yes, starting last week, that was the deadline by which platforms had to do something. And we did see widespread children's accounts being restricted. We also sort of saw widespread circumvention. So it definitely didn't go off without a hitch. I'm sure we'll get a chance to talk about that later on. But now, you know, as of this week, we're now almost a week into our post-social media world for teens and, you know, largely, I guess life has continued in a way. I think that gives us a great introduction to it. And there are a number of things I want to pick up on in what you were saying. But I want to start with the origins, right? Because I know last time you were on the show, we talked about how certain media organizations were pushing this and there were like certain parental groups that were supporting kind of an age limit on social media. And I know that you've also been reporting recently about particular kind of interest groups that were pushing this. So what do we know about how this effort to do an age limit on social media actually came about? Like, who were the main movers here in actually getting this to happen? Yeah, we're all kind of existing in the same kind of global swirl of everything that's happening. And a book that many people will be familiar with, The Anxious Generation by Jonathan Haidt, is kind of like one of the early dominoes that tipped this off. 
So the wife of one of the heads of an Australian state read the book, told her partner, her husband, that maybe we should look into this and do something about it. And he was early on to say, yep, we want to do something like this. And he actually commissioned his state to do like a formal legal review of how to make it happen. Then another state kind of joined in. And that was around the time that things started really getting going in Australia in early 2024. Then these two campaigns came out. They were pushing for this at a national level to say, let's raise the kind of industry-standard de facto minimum age from 13 to 16. And both of those essentially, you could say, were backed by or came from mainstream media organizations. So one was News Corp. You're very familiar with that, I'm sure. They ran a campaign called Let Them Be Kids. And they were campaigning for this, you know, as a news outlet; they were saying directly, you know, we are campaigning for this. And when it happened, they took credit for it. And the other one was this other group called 36 Months, which was a group co-founded by a host of a popular FM radio show and also a guy who runs a firm that produces video, including advertisements. And it was on the radio show of that co-host where the prime minister, who went on it, committed to raising the age, or said, I support it. And, you know, like, obviously six months later it happens. So you can kind of see that as, I guess, kind of indicative. But, yeah, I mean, both of those came from, and I would say the campaign broadly had, quite a lot of support from traditional media organizations. 
Now, the one thing that kind of gets brought up a lot is, as people might know, Australia, in terms of, like, some of the interesting internet regulation we're doing generally, one of the things that we've done since the early 2020s is that we have forced Google and Meta to pay Australian journalists via the News Media Bargaining Code. We were kind of first on that. I think Canada tried to follow, and a couple of other places, California as well, I think with less success. But the reason to raise that for context is that when we passed that law initially, it led to hundreds of millions of dollars over three years going from Google and Meta towards news outlets. Those agreements were initially signed for three years. Those deals had kind of come up at the start of last year, and at least Meta had indicated that they weren't wanting to pay these anymore. And so it's kind of in the midst of that that we saw this really strong campaign, particularly from News Corp, to coincidentally focus on the harms of social media. Now, I should say, like, News Corp, people have denied that. You know, they said that that wasn't the cause of it. And, like, you know, to be sure, like, these outlets have always covered the ills of social media and, you know, the kind of tabloids and, you know, what tech is doing. You know, kids is obviously a very popular story no matter what. But, yeah, of course, like, it has been kind of noted that it has been involved. And I think like why that's kind of interesting in the context is that neither of these groups really came from, like, an academic place, you know, a push from experts on the ground, mental health groups. It was more of a push through a kind of, you know, grassroots support. And to the credit of these campaigns, like obviously I've said they've been really backed by media organizations, they weren't like by any means just like elites. 
Like, there was the group that I was talking about that's been backed by the radio host, it's called 36 Months. You know, in the time between when they launched it in May and when the law passed in November, he had like 127,000 signatures supporting the move on change.org. So, like, they were able to tap into this groundswell of support, really, around this idea that I think a lot of people have had. I hear from politicians actually quite regularly that, particularly since the COVID-19 pandemic, there's been a lot of angst in the community about young people, young people's well-being, and particularly at an intersection of them using technology. So we had these campaigns. I would describe them as a very broad populist push that was able to both tap into this community sentiment and really effectively use, for example, a lot of really devastating stories about parents who lost children and said that social media kind of contributed to that. And a really simple ask as well. I think that's like such a big thing about this campaign, which is like, you know, tech regulation is so complex, and Australia has been doing some interesting stuff, but it's often, like, hard to explain. This is very simple. Like, social media is bad. We've seen the welfare of kids get worse. Why don't we ban it for a certain age group? And that ultimately was successful. I just want to pick up on what you said about the news bargaining code there. As you mentioned, other jurisdictions have tried this as well with different levels of success. You know, in Canada, Google made a deal, but Meta just blocked news entirely. And as you say, I know that they mentioned doing the same down in Australia. Were the deals renewed in Australia, or what happened there? Not yet, essentially. So there is actually a push to do a renewed version of the law that kind of... not to get too much into the weeds of things, but the law didn't say you have to pay these companies. 
They just said, if news publishers don't enter into deals with these platforms, they'll be forced to go to an arbitration. That was essentially the risk: if you guys don't come up with enough deals that make the government happy and feel like things have been sorted, you will potentially go into a kind of negotiation that could end up very poorly for you. So that was kind of done over the last couple of years. But, you know, at the start of last year, Meta said, we're not doing this anymore. Like, we think, essentially, the dynamics have changed, we think the value has changed, and so we're not, you know... and we have also, as a platform, you know, withdrawn or kind of dialed down all the news content on there. So that was the kind of context in which this is all happening. Australia doesn't have this new law yet; it's still being considered. Of course, the complicating factor in all this is the Trump administration and how it is wielding the US government's force, and the threat of force, in trade, using the kind of trade relationships that we have. That has kind of complicated the idea that Australia is going to force these companies to then pay again, because obviously we know of the cozy relationship between big tech and the Trump administration at the moment. So going back to the social media age limit, right? You know, you were saying it's now in effect, you know, the government has rolled this out. And I know that there was this kind of big experiment with like different technologies, how it might work, how implementation might look for something like this. And as you mentioned, there was an age limit on social media before of 13. There were obviously questions about how well that was enforced or whether the companies really did much to enforce that. So what does it look like now to keep those under 16 off of these social media platforms designated under the law? 
Like, how does that work in practice? I think the best way to understand this law is that it's two parts. The first one is this idea that we're raising the minimum age from 13 to 16 in Australia. That was just a thing companies did, you know; my understanding is that it's based on US law and US legislation about kids' online safety over there that became the international de facto standard. So there's the part saying, we're going to raise this from 13 to 16. The second part of the law is saying that these companies actually need to enforce this, otherwise they face fines of up to 50 million dollars for systemic failure to do it. So a big fat fine to whack you if you're not actually doing this kind of thing. And I think that's a really interesting thing to note, because obviously a lot of this debate has been around the age, and that is something that I have kind of focused on, because I think it's probably the least based in kind of evidence from the groups and individuals and academics who study this the most. But in terms of enforcing it, like, the dirty secret of big tech for so long was that they had these rules and they were flouted by a lot of people. I mean, there are statistics from the Australian government, which has done a lot of research with young people and how they use social media, to be like, you know, incredible amounts of people under the age of 13, under the age of like nine, are on social media. And that's because for a lot of the modern internet, we've just often relied on a system where, you know, you come to a kind of page that asks what age you are and you say, I'm, you know, 65 or whatever, and they just take that as gospel. And obviously, you know, like I'm sure we'll get into later on, like I've kind of been critical of some of the parts of the process that have led us here. But I think, you know, while part of the kind of seeds of this, as I alluded to before, are in people getting worried about their own use, and their children's use, of 
technology during COVID, when we went through this pretty drastic, I think, acceleration of tech habits and engagement, like how enmeshed it is in our lives. But the other seed of this is just the fact that for years and years and years, these companies have not actually really enforced, have not put a lot of resources into stopping people under their minimum age from being on these platforms. And I think it's very hard for them now to look back and say, we were in good faith being like, we've been trying to do this well, because it really has not been the case. So to directly answer your question, there's kind of, I would say, three main ways in which this is done. And we'll go from the least onerous to the most onerous. So the first one is what they kind of call age inference. And that's a fancy way of saying figuring out your age from data that we already have. Most people on social media platforms have already given an incredible amount of information to these platforms purely based on how they use them. Everything from, I mean, obviously the date of birth that you put in there in the first place, to your email address. I mean, there's not many kids under 16 with a .gov email address, or .gov.au. Then to things like using the images and content that you've posted online. So major platforms do actually have, like, you know, Meta has had for a long time what they call AI classifiers that review content to see if there are signals that suggest that you are over a certain age. So for example, if you're posting on Instagram, just had like the best ninth birthday, that's a signal that they can pick up and be like, you're probably not 65 as you told us initially. So all those kinds of things go into this quite passive system that many of these platforms have had to be like, here's your age. So, two more onerous systems. 
And these are the ones that you kind of graduate to if the platform has reason to suspect that you might be under 16. The next one is what they call age estimation. And these are the ones that you're probably most familiar with in terms of using biometric information. So facial scanning has become a really major part of how Australia is enforcing this law and what the kind of expectation has been. I think all the major platforms, maybe not all, but most of the major platforms, are using that as a core part of how they figure out people's ages. So a lot of them have outsourced this to third-party providers like Yoti and k-ID. But essentially they say, you can use your phone, your computer, you can do a short... sorry, an image or a short video, that will then analyze how you look to figure out your age. We think of our faces as very, very linked to our identity. If done in best practice, this has the capacity to, for example, figure out your age without necessarily needing to know who you are. You know, the scanning that they're doing isn't saying, this is Cam Wilson, and I know that he's like 35. It's saying that this person has the facial features of someone who's 35, and so they can go on through. You know, these systems aren't perfect, and we've heard widespread examples of teens scrunching up their faces to bring out their wrinkles, putting on makeup, putting texture over their eyebrows, things that will trick these systems, which are inherently vulnerable to just people making alterations to the way they look to suggest that they're older. So there's that. And the final kind of option is this idea of uploading government ID. So driver's licenses, passports, those kinds of things. 
And that's really the one where it's like, say they're unable to figure out from your data, or, you know, by your face scan, that you are over 16, or they've just decided that you're not and you need to contest that, then there's the ability to actually upload, you know, real documentation. This is called age verification, because obviously documentation is something that you can really verify; you know, you can check it against an official source. And that's kind of the final way to do it. So it's a kind of, as they call it, waterfall system. So the idea is, and this is what the government has instructed these platforms, to be like, we don't actually want you to go to the most onerous thing first. We don't want you to require every person to upload their ID, because while that might be simpler for you, because you won't get fined, we think that's not a good user experience. We would actually prefer that you see if you can do it in a way that is less onerous, that is getting less information, because of course, while you may have already volunteered a lot of information to these platforms, providing ID is also another form of information, more data that exposes you to risks. So they're saying, we want you to try and use the less onerous ones, the less intrusive versions, although what you might think of as intrusive varies from person to person, before you then kind of graduate up that scale to the ones that are more onerous. I would assume that many of the platforms, as you were saying there, ID is not the main thing that they are focusing on. They're trying to use these other systems, at least first, before asking for something like that. Would that be the right way to understand it? Yeah, that would be the way to understand it. And I think, because the thing is, as it's often said, and I'm sure most of your listeners would know, to stop people who are under 16 from getting on platforms, you need to figure out everyone's age, right? 
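The "waterfall" Cam describes (passive inference from existing data first, then biometric age estimation, then government-ID verification as a last resort) can be sketched roughly as follows. This is an illustrative sketch only: the function names, the confidence threshold, and the signal fields are all assumptions for the sake of the example, not any platform's actual implementation or the eSafety Commissioner's specification.

```python
# Hypothetical sketch of a "waterfall" age-assurance flow: try the least
# intrusive check first and escalate only when earlier checks are
# inconclusive. All thresholds and field names are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

MIN_AGE = 16                 # Australia's new minimum age for accounts
CONFIDENCE_THRESHOLD = 0.9   # assumed cutoff for trusting passive inference


@dataclass
class AccountSignals:
    stated_age: int                          # self-declared age at signup
    inferred_age: Optional[int] = None       # e.g. AI classifiers on content
    inference_confidence: float = 0.0        # 0.0 .. 1.0
    estimated_age: Optional[int] = None      # facial age estimation, if run
    verified_age: Optional[int] = None       # government-ID check, if run


def waterfall_age_check(signals: AccountSignals) -> str:
    """Walk the waterfall and report which step resolved the account."""
    # Step 1: passive inference from data the platform already holds.
    if (signals.inferred_age is not None
            and signals.inference_confidence >= CONFIDENCE_THRESHOLD):
        return ("allow:inference" if signals.inferred_age >= MIN_AGE
                else "restrict:inference")
    # Step 2: biometric age estimation (e.g. a facial scan via a third party).
    if signals.estimated_age is not None:
        return ("allow:estimation" if signals.estimated_age >= MIN_AGE
                else "restrict:estimation")
    # Step 3: hard verification against a government ID document.
    if signals.verified_age is not None:
        return ("allow:verification" if signals.verified_age >= MIN_AGE
                else "restrict:verification")
    # Nothing conclusive yet: ask the user to complete the next check.
    return "escalate:request_estimation"
```

The point of ordering the steps this way, as described in the episode, is that most adult accounts resolve at step 1 without the user doing anything, so only ambiguous or contested cases ever reach a facial scan or an ID upload.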
But when December 10, that deadline, came and went, all 28 million Australians who are on social media were not asked to upload their ID. Most of it actually happened behind the scenes. Most of the initial age checks being done by these platforms were largely done by data inference. And so one of the criticisms, of course, of this legislation and these kinds of approaches is that you are asking everyone to give a whole lot of information that they haven't had to before. But in the execution of it, at least from the Australian perspective, a lot of people just didn't have to do anything more than what they've already done. It's kind of funny, because I think in a way it's almost like the government and the tech companies didn't want to publicize this too much, because if you think about it, it's kind of unsettling to be told, oh, we can with pretty reasonable accuracy tell your age without you having to do anything else. That is a bit freaky. But that was actually one thing they were able to do, using some of these technologies, so that a lot of the time they didn't have to ask for any more identification to be able to say, you're probably over 16, so we don't need to restrict you. Yeah, and so then I guess it's just in certain instances where they need to move forward to the facial scans or, you know, the biometric checks, or even something like an ID. I want to pick up on more aspects of the process of this law coming into force. But I think before we do that, it's worth just asking: as you said, this is in force right now. The companies are rolling this out. The government is expecting them to do this. What has been the response now that it is actually there? Does it seem like it's being effective? How are young people responding to this? What are you seeing on that side of it? I think it's very hard to tell. So I'll tell you how it went.
The first day that it happened, there was widespread media coverage about how teens are easily able to get through. And that is definitely true. There is definitely some portion of teens who are able to get through it. If you went on any of the social media platforms, you would see them all over it. In fact, if you went to the prime minister's TikTok or Facebook account, you would see his comments flooded with teens being like, I'm still here, try and get me. I saw this campaign, maybe it was even you who posted about it, where young people were being urged to unfollow the prime minister. And I was like, man, how many teens are following the prime minister in the first place? Yeah. And I think, thinking critically about it, you're not going to hear from the teens who've been kicked off platforms that they've been kicked off, because they're not there anymore. And we know that there are more than a million kids in that age group who are supposed to be kicked off, most of them with social media accounts. So we haven't actually got numbers yet from the platforms, and I'm told we're expected to get them in the next week or two, to see how many accounts they've kicked off. But it's one of those problems: how do you measure something that's trying to evade being measured? How do you find out how many teens are still on there, hiding out somewhere, trying to get around the ban? I don't really know. I mean, I know there are going to be surveys and things on all of this. So that is, of course, the way that you measure it more broadly. But the first day there was widespread circumvention. People were talking about all the easy ways they were getting around it. So I think that, in the first instance, it confirmed a lot of people's suspicion that these techniques might not be that successful.
That being said, I do think there is a case to be made that, you know, this is day one. This is obviously going to be an ongoing thing. I think the systems are going to get tweaked and improved. So I would imagine that this ends up being more and more effective over time. But we'll see. So that was the first day. But then since then, I don't think there has been a whole lot, I mean, since the first week. And what's important to note is also the absence of stories, which is that I did not actually end up seeing that many people saying, I am over 16, I got locked out of my account, I'm unable to use these platforms. And so for those reasons, and I have nothing on this from the government side or the tech company side, because I don't think they would be too loud about it, but if I were imagining it, I would definitely start out with a lower bar of accuracy and confidence, rather than ramp it up and unnecessarily kick a lot of people off the platforms and cause a fuss. And there were some people, like that premier who I spoke about, who kind of kicked this all off, and that 36 Months group, who were saying, we expect big tech to actively sabotage this because we think they don't want this to spread around the world, which I understand. But I also think, for these platforms, their users are everything, not just the 13-to-16s. The fact that they're on there, that network effect, is the thing that is their business model. And so if they were trying to make this unnecessarily onerous, if they were going to try and make this a pain, that would actually hurt them even more. So I was always kind of suspicious about the idea that they would essentially try and fuck things up just to make this look bad.
If anything, I suspected that they would try and make this as easy as possible, and maybe that means some teens are getting through and maybe we'll catch them later on, rather than having a kind of widespread change that would ultimately really ruin the user experience. So all that to say: that happened last week, a lot of people got through, and I think we really have to wait and see how it goes, because that's day one. And I'm not just repeating the government line, because of course they'd want to say that, given day one didn't seem like it went that well. I think it's very, very hard to know. And I think sometimes when people chalk it up as a failure or whatever, it's kind of premature, considering that presumably this is something that is going to be in place for a long time. Yeah. And I imagine if you're the platforms, you could easily watch the comments on Anthony Albanese's Instagram or something like that, see which teens are saying they got through, and knock them off one by one. So based on what you're saying, then, we're not very far into this. It looks like some people got through, but also that a lot of people who wouldn't be under the age limit aren't getting hit with it. Does that mean at this point we can say it looks like it's not effective? That it looks like it is effective? Does it look like some of the concerns leading up to the implementation were overblown, or maybe accurate? Is it too early to get a read on any of those things at this point? It's hard to get a read. And I think, you know, this is why it's so hard, and we haven't had a chance to touch on this yet: we have just a real mishmash of justifications.
The prime minister started off by saying social media is causing social harm, and we saw that message honed over time into an argument about what were called predatory algorithms, and the features of platforms that were encouraging harmful use of their products, keeping people on there longer, radicalization, all that kind of stuff. It essentially became like a touch-grass policy, as in: get the kids off their dang phones and onto the footy fields or whatever. The problem with that is that when they had to translate that idea into practice, what we ended up with was this policy that, while it had a very wide definition of what a social media platform was, also had exclusions: things like messaging apps weren't included. And then the following question is, what's a messaging app? Snapchat was ruled as not a messaging app, but WhatsApp and Telegram were. And WhatsApp, I don't know if people are familiar with these features, I don't really use them that often, but it actually has a lot of social features: you've got stories, you've got kind of Facebook-page-style mass broadcast stuff. Gaming was also excluded, and particularly in the world of youth safety at the moment, Roblox has been such a hot-button issue, and Roblox of course is excluded because it's a game. And then there's also the fact that the way the law was written, it only applies to people using accounts on the platforms, which allows those platforms still to be used in a logged-out state. You can still watch YouTube all you want. You can actually use the TikTok app without being logged in, and you will still get a customized feed. You still get algorithmic recommendations.
So when they had to translate this law into something that actually addressed those things they had raised as the issues, you kind of saw how it almost drifted away from them, because it's like, okay, great, so you want kids to be off their phones, and yet, for example, you can still watch as much YouTube and TikTok as you want, you can still play Roblox, you can still do all these things. I would not be surprised if we don't see a drastic change in kids' screen time, because maybe some of them will go outside, but I think a lot of them will just substitute one thing for another, whether it's going from Snapchat to WhatsApp, or from logged-in YouTube to logged-out YouTube. So for those reasons, the success of this as a law, while it's partly about how well it's keeping people out, and there are certainly questions over that at the moment, though we'll have more idea later on, I don't know how well it will work. And I also think maybe they just ratchet it up as it goes along, maybe they catch more and more people and restrict them more and more from these social media platforms they aren't allowed to be on. But those other purposes for the law, it's not even addressing, so it's just very bizarre to try to understand the effectiveness of it. And so age verification technology and age estimation technology, which is such a hot-button issue, really to me became only one part of how we understand how effective it is and how we look at it in the future. And particularly as the law came close to coming into effect, we saw a real shifting of the goalposts by the government, going from, we're going to do something about cyberbullying, we're going to do something about radicalisation.
This is going to do something about kids spending too long online and not getting enough sleep, to eventually, a week before the law came into effect, the prime minister saying, this law is already a success because it started conversations. And they said the point of the law is to change social expectations, so that not everyone will just assume that everyone else is on social media at that age. And to me, I feel like that is such a drift of the law from its original purposes. It's such a low bar that you are left questioning, what is the point, why have we ended up doing it like this? Because if that's the measure of success, I guess you could argue there have been plenty of conversations. There's been widespread discussion. And, you know, to its credit, I've definitely thought about my screen time. I've definitely thought more than I have in a long time about how kids are using social media. In terms of effectiveness, and maybe we'll get a chance to chat about this later, I have been critical about it because I think Australia went out on a limb, did something that got a lot of attention, has tried to stare down big tech, which opposed it, and is facing legal challenges over it. But the reform they pursued, this broad ban, is marred by questions about how effective it will be at stopping kids, and then, even for the kids who are stopped, how effective it will be at changing those habits. They did that rather than doing... There is actually a whole bunch of other tech regulation that, if you wanted to be bold and do something new, you could do, but we chose not to, in order to pursue this thing, which is easy to sell, but to me is not as effective as some of the other stuff I think they could have done, but chose not to. Yeah. And I would definitely like to circle back to that a little bit later to talk about some of those things, right?
And one of the things I really appreciated about your reporting on this over the past year as it's been evolving is that, you know, it's clear that you are concerned about the impacts of social media on young people, but on people more broadly, right? You were talking about thinking about your own screen time; I've been doing the same thing. And, you know, whether this is really the right policy to address those issues, especially, as you're saying, when the government is talking about so many different things that this is supposedly going to take on, and it's not clear how it would actually do that. And I was interested in the process of this law coming into being and how it has evolved, because through your reporting, I know that there was a deal between the Labor Party, which is in government right now, and the right-wing Coalition that removed some key parts from the original legislation that would have potentially been quite helpful there. And that there was also concern on the side of the eSafety Commissioner, which moves a lot of these kinds of digital regulations forward, that some other critical features were not in the bill. So can you talk a bit about the legislation itself and what was missed there that could have really been helpful? Yeah, sure. I think this speaks to one of the reasons why this has been a political and populist kind of policy, even though there were a lot of people working on it who had deep expertise and a good understanding of platforms and how technology works.
So we've got this ban, right: kids under 16 can't have accounts. Great. There was originally a part of the bill called the exemption framework, which was essentially a get-out-of-jail-free card for the platforms, provided that they got rid of a number of features. So the idea was: we said, oh, we're worried about kids using these platforms, we think the algorithms are radicalizing, all the top-level stuff you hear about social media platforms all the time. But then rather than just banning them, they said: if you got rid of some features, and those features could have included things like endless scroll, push notifications, gamified things like Snap streaks, they weren't set in stone, but the idea was that you can identify some things you think are the parts of social media doing the harm; if you get rid of those, and release a version of your app that doesn't have them, then you can get through the ban. And the parallel to that is something like Messenger Kids. Meta, formerly Facebook, has a kids' version of Facebook Messenger, where you can still chat to other people on the platform, but it's limited in what you can do, you can limit who you chat to, and it can be linked to a parent's account, all that kind of stuff. So if you think about it, there is precedent for having an alternate version of these major platforms released for young people, where we've decided there are certain things they shouldn't be able to have, that they should have a curated and different experience. That was originally in the act, and it was consulted on when they developed it. In fact, in Australia, when you introduce a law, you've got to introduce this analysis, produced by public servants, the point of which is to explain it outside the political sphere, and that analysis talked about this. In fact, when the government first briefed out the law, they said, we're introducing this law into parliament. Two weeks later, they introduced it to parliament, and it no longer had this exemption framework that the media had been told would be there when it was first announced. So it got taken out, and I reported that this was because of a political deal between the major governing party and the conservative opposition party. My understanding is that they were just concerned that tech companies would be able to game the rules somehow, able to keep their apps in the hands of kids or whatever without actually addressing the harms. But for me, and speaking to, for example, people who worked on the law, who drafted the law, it was like a gut punch to them. Because instead of a law that said, we're banning kids, but we're going to give tech platforms a way out if they produce versions of their applications that are in line with our expectations, that we think, by our rules, are less damaging to kids, we create an incentive structure for them to actually do something: if you still want to be able to have younger users on this platform, and all the platforms are constantly worried about new platforms popping up with young users in particular, you should create children-friendly versions of it. And I think the reason why this is so interesting is because, taking a step back, part of my framework for thinking about all this stuff is that I don't think the idea of social media is inherently a harmful one.
Communicating with other people, organizing and, I guess, mobilizing people online through technology, for example, like you and me right now, Paris, or us and the listener, being on the other ends of a piece of technology, none of that is inherently harmful. It's the fact that we have these major platforms, with their incentives, that have dominated the space, that have essentially very limited real competition, even though there are technically endless competitors, because the compounding effects of the network effect and all that kind of stuff mean it's very hard for anyone to go from, say, Facebook to another platform, because all your friends are there or whatever. I think that having a reform that just says, no matter what, if you are a social media application, you are banned, is fundamentally condemning the whole internet, and any potential it has, as broken. Whereas I think the point is, we should all be thinking about this, whether we're legislators or just normal people trying to set up our own communities: how can we use technology in a way that is not harmful? What incentive structures can we create, or how can we ourselves choose to be part of places online that aren't harmful? This part of the law, which I thought could have directed social media platforms with real incentives in Australia, being, if you still want access to millions of kid users, if you don't want them to go to these other platforms, you need to do X, Y, Z, was abandoned. And I think, fundamentally, as listeners will have heard from all the twists and turns in this, pushing back towards that broad ban and not giving a way out leans into this idea that no matter what, technology is something you treat with a kind of
puritanical idea: it's like abstinence. You just need to be off there altogether until later on, and that's fine, because once you turn 16, you're fine. Rather than asking, can we use our power as a country to twist these platforms, against their will, into doing something that actually might be a better experience for kids and for everyone? I think this is key, right? And I think it gets back to something we were talking about last time you were on the show to discuss the prospect of this law, which is that I think we're both kind of skeptical that an age limit is really the way you would want to address these issues, especially when they're issues that don't solely affect young people, but affect so many people at so many different ages, right? And one of the things that I took from your reporting was this notion that there was potentially a missed opportunity, or an opportunity cost, in pursuing this age limit legislation, when over the past year there were other initiatives, regulatory or otherwise, that were actually being developed, I believe by the eSafety Commissioner, but you can correct me on that, that would have tried to address some of these more harmful features of the platforms, but that then kind of fell by the wayside as the whole discussion and focus became this age limit. So can you talk to me about that aspect of this? Australia has been doing some innovative, new things around internet regulation for a while. And I think a lot of it just didn't really get noticed.
And then when this ban came about, some of the ban stuff literally trampled over it and made it either redundant or muted it, because at the same time as what I would say was a more nuanced approach, they just decided, hey, we'll ban the kids, and that's that. So, for example, the Online Safety Act, which was introduced in the early 2020s, was on this long process of introducing basic regulations around how platforms, social media platforms but also others like search engines, would deal with certain kinds of content, including violent content, explicit content, all that kind of stuff. And that was in train at the same time. This government had also previously passed privacy reforms that were introducing a children's online privacy code, which would again restrict what these platforms could collect on users, and that goes some way as well towards stopping some of the more harmful targeting of young people. And that's still going, but now, because of the ban, a lot of the platforms that would have come under it obviously don't need to abide by it, because they're not supposed to have children on them at all. So there were definitely things happening, and one of the things I think has been such a failure of this is that very often the debate has been framed as ban or do nothing. Whereas the truth is there are interesting things that we could have done. Like I mentioned before, a lot of the rhetoric recently has focused on this idea of predatory algorithms and features of these platforms, which could definitely be addressed through regulation. There is actually, simultaneously, a campaign launched recently called Fix Our Feed, by a well-known Australian sex education group, which is saying not only do we want to force platforms to offer an algorithm-free feed, which I guess would mean a chronological feed for social media, but we also want to make that the default, so people would be shown the chronological rather than the algorithmic feed unless they opted out. Again, I'm sure there's fair criticism of it and ways it could be tweaked, but that's the kind of idea that's out there. But instead of doing something like that, not only did we not go with a different reform like that when we had this courage to do something bold and potentially change these platforms, we also trampled on these things that were already happening, which I just don't think many people know about. And you mentioned the eSafety Commissioner: the eSafety Commissioner was keeping track of and overseeing that whole process, which was slower, and also, I should say, does draw criticism from digital rights groups, who argue it doesn't go far enough in the way it is written. The point is just to say, whether it's the country at large or individual families, I'm just worried that people will think, we passed this law, and that's online safety dealt with. Not only will individuals think, well, I don't have to worry about my kids on social media anymore, but as a country we'll feel we've dealt with it as well. But yeah, there was that thing where we just missed this incredible opportunity to do something that, in my mind, would actually be tougher on big tech: rather than just saying we ban kids, where everyone has the same ban and everyone picks back up at 16 or whatever, to say, let's actually force you, if you want to have youth users, and perhaps even older users, to work in these ways, to our expectations. To me, that's actually harder on big tech than a blanket ban. I think that's such a key point, right?
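The "chronological by default" idea behind the Fix Our Feed proposal can be illustrated with a toy example. This is a hypothetical sketch, not the campaign's actual proposal text or any platform's code: the setting name, post fields, and ranking are all invented for illustration.

```python
from dataclasses import dataclass

# Toy illustration of "chronological by default": the recommender-driven
# feed becomes something a user must actively opt in to, not the default.

@dataclass
class FeedSettings:
    algorithmic_feed: bool = False  # default is the chronological feed

def order_feed(posts: list, settings: FeedSettings) -> list:
    if settings.algorithmic_feed:
        # Opt-in path: engagement-driven ranking (stand-in for a recommender).
        return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
    # Default path: plain reverse-chronological ordering.
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)
```

The regulatory point reduces to the one default value: `FeedSettings()` with no arguments yields the chronological feed, so users get the algorithmic version only by explicit choice.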
And I think that that's the direction we would want to see this go, and it would be great if Australia could be a leader on that, right? I want to touch on some bigger questions before we end off our conversation. Obviously this whole process has been focused on social media, right? But I feel like over the past year or 18 months, since this process really kicked off, we've been talking a lot more about chatbots, and how we're seeing many of these issues, or many distinct issues, coming from the use of chatbots, whether by young people or even older people. Has the conversation evolved or changed as a result of generative AI and chatbots over the past year or so? Yeah, I definitely think so. I think it's interesting. We were talking about that other internet regulation that's happening. I think Australia was one of the first places where, through that regulation process, the online safety industry codes, the eSafety Commissioner has now required all chatbots, it starts in March, but it was settled back in September, to age-verify their users and also to prevent minors, so non-adults, from having sexualized or graphic conversations with them. And I think Australia was probably first on that. It was a great example of how, with these initial regulations, that kind of thing can happen quietly, without necessarily a national campaign push; you can just go forth. So it's definitely something where, again, Australia is doing things in an interesting way, if maybe not the most high-profile way. There has been some question about whether kids without social media will then turn to chatbots. Around the world we're seeing some pretty high levels of chatbot use by young people. I haven't seen a direct substitution, like teens saying, I can't get on Facebook, so I guess I'm going to go, I don't know,
sext with Grok or whatever. But I think that's largely because, in the same way that I don't think we've actually ended up seeing that much VPN use, or even that much alternate social media platform use, there has definitely been some, but not a whole lot, the age verification so far has been quite porous. Maybe that's something to keep an eye on in the future, as younger users age into a cohort that has never had social media but now relies more on chatbots. But I wouldn't say it's been a massive part of the discourse here. That's really interesting. But, you know, I wanted to ask, right: one of the things I'm always interested in with Australia is that I feel like it's so similar to Canada, but Canada is so behind on internet regulation. And so it's interesting to watch Australia try to be a leader, try some new things, even if they don't always work out properly. But I think it's good to attempt it, right? Good to try to make some moves. One of the discussions that I feel like has gained a lot of prominence in the past year since we talked, and you mentioned Trump earlier, is this notion of digital sovereignty, reliance on US technology and all these sorts of things. And I'm sure that this is a conversation that's playing out somewhat in Australia. And I feel like we have heard the prime minister mention things like this, even around the social media age limit, around the dependence on these companies and all this kind of stuff. But I also saw that the government signed a contract with OpenAI recently. Obviously, we know that Palantir has contracts there just as they do in Canada. Is there much discussion about digital sovereignty going on in Australia right now? Is there a movement in that direction? What are you seeing there in response to what the Trump administration and these tech companies are doing?
Yeah, I would sort of separate it into two areas. When there is major consideration from mainstream media and politicians about the US, I largely see it through the idea of retaliation against regulation by Australia. We haven't seen anything so far. Australia has been mentioned in a bunch of White House orders, I think, as potentially passing regulations, like the EU and others, that they're not very happy about. But as far as I can tell, nothing has happened so far. I mean, I think we've always had the most favorable trade relationship with the US, even under the Trump administration. So nothing so far, but very clearly the specter of it has been raised a lot. It does seem like, since the beginning of the year when Trump came into office, there had been, I would say, a bit of a slowdown in tech regulation, until about a month ago, after our prime minister had a meeting with Trump, and afterwards things moved. So we had that news media bargaining code, the money towards journalists and news outlets. A big thing has been content streaming quotas, the idea that streaming platforms in Australia have to produce a certain amount of Australian content. And we also had some other tax stuff about global companies and how that affects tech companies in terms of paying tax here. The content quotas are one place where you're following after us, I think. We did that one first. Oh, yeah, yeah. Yeah. So we didn't have anything, but then all of a sudden, after the government had committed to it a long time ago but hadn't done anything, which was widely understood to be out of fear of retaliation from the US government, there was a bit of progress. So I think the government passed the content quotas, and some of this other tax reform has been happening. So there is some progress, but clearly that fear has been a big factor.
And so, going back to data sovereignty, I think there has been absolutely no discussion at a national level about that from mainstream outlets. Australia has a continued closeness to the US in terms of defense. We have this AUKUS deal; you might have heard Joe Biden loved talking about it, and in Australia, we love talking about it. There is no major consideration, I would say, about how US tech companies might be using and storing data and what it means to have Australian data subject to American whims, particularly under this administration. In fact, Australia actually has a bunch of laws, or at least one or two, that make it easier for US legal orders and the like to be carried out. So despite the fact that we are uniquely vulnerable if, say, the Trump administration wanted to do certain things with the data it holds, either literally in the US or held by American companies in Australia, there isn't very much real consideration of that. The Australian government has recently announced, like many other governments, that it's very keen to work with all these tech companies. There is definitely some lip service paid to this idea of data being stored in Australia, but it's very murky and difficult to understand what exactly that even means, because it gets so complex: what is stored in Australia, what's not, and even if it's in Australia, what is still subject to US jurisdiction because of some of those laws I spoke about. I wrote a story a couple of months ago where I was, I think, the first Australian journalist to FOI a public servant's use of chatbots, and I got a very, very senior Home Affairs, so national security, staffer's use of chatbots. It's an area I haven't focused on a lot, because I've been pecking away at the Australian social media ban. 
But one of the questions that really came out of it for me was: yes, I know that some data is stored locally, but if Australian public servants are using, it was Microsoft Copilot, my understanding is it can't be run entirely locally. Whether it is cordoned off from the US, whether it's all run in Australia with no transmission back to US servers, and even if there isn't, to what extent it would still be accessible by American staff, and by people who are subject to American legal orders, I don't really know. I think it's a very complex question that people try not to think about, and it's also a little bit impossible to disentangle. Yeah, totally. There was just a big Microsoft announcement in Canada last week about this massive new data center investment they're making here and how it's going to enhance Canadian data sovereignty. And then in the announcement, there's even a section where it's like, if a foreign government tells us we need to hand over the data, we can't really say no. And it's like, okay, well, how is this sovereignty? It's wild. It's very funny, because I feel like it hasn't happened in Australia, but I think we saw it happen a bit in France, and it has been happening, obviously, a bit in Europe and globally. I kind of see it as the reverse argument. Sorry, it's not the reverse; it's the same argument that's being made against TikTok, often by these very same US senators, who are saying: what are these servers, and what could be on there? 
And now it's being used against them as well, because all the questions you'd want to ask, even if the data is on Australian servers: which American staff have access to it, what capacity America has to request that information, and so on, don't really get asked here, because there's not a lot of questioning about it. Of course, a lot of mainstream media coverage is very high level, so you can understand why they're not necessarily getting to that. But I do get the sense, from talking to staff at quite big Australian companies, that there is a kind of unease and a desire to be able to extricate themselves somewhat from this. You might not get it talked about in Australian politics, because there is a pretty bipartisan commitment to being with America through and through, and the question of what happens if they start acting against our interests isn't really countenanced. But when it comes to money, people are really starting to think about it seriously, and it's an uncomfortable kind of thought. One more point on that: after the AUKUS deal was announced, there was a lot of discussion in Canada, pushed especially by military folks, about why Canada wasn't included in AUKUS. One good thing about everything with the Trump administration now is that we never hear any discussion about whether Canada should join AUKUS in this moment, so that's good. Okay, if you want to be like Australia and just give the US a bunch of money for submarines that never turn up, I'm sure you can do that outside of AUKUS. Yeah, totally. Totally. 
You know, we'll probably do that with F-35s or something instead. Final question for you, Cam. Getting back to this age limit legislation, as we were talking about earlier, there are a lot of other jurisdictions looking to follow the Australian example, or looking to do something similar. How should the Australian experience inform these other jurisdictions as they look to go down this route? What lessons can they learn from what you have seen in Australia? And are there better paths to take than trying to do a hard age limit? Yeah, it's a good question. Even the other countries that are indicating their support are often doing things like lower ages. The EU, for example, is saying we'll do 15, and they're also allowing kids to go on younger with, for example, parental consent and that kind of thing. So understand that if you want to do something like a ban or a minimum age, there is a whole wide range of ways to approach it. The thing I would say, and this is the criticism I've had the whole time of how it's played out, is that whether you think a ban will work or whether you think a ban won't work, the way Australia has implemented it has been really, really shoddy. In fact, because I was quite critical and covered what I saw as flaws in the process, people just assumed that I'm inherently against the ban. But the case I always make is that if you support a ban and want it figured out and implemented in the most solid way, you should be extremely disappointed in the way Australia has rolled this out. It's a very bizarre thing, and I think the problem the government had is that 
ultimately it is a very, very broad, flat tool they're trying to use on this massive problem, and as a result the implementation was always going to be kind of weird, because maybe it wasn't the right tool, and it's very much not the right tool in isolation to deal with the problems they really want to deal with. So the thing I would say is: I'd love to see more engagement with experts who are thinking not just about how we kick kids offline, but about how we design online spaces and online experiences for young people, and for older people as well, to have better times online. In particular, even if a country is introducing a ban, that may be one thing, but you should be thinking about all this other stuff too. I really do fundamentally think that big tech has failed to take care of its users over the last two decades, and young people in particular have been really, really failed. I don't trust big tech to implement the best ways to take care of its users without regulation. If you have any doubt about that, you should see the way they rolled out artificial intelligence technologies, repeating many of the same mistakes, without many of the safety features they had learned from doing social media. I think that proves that when push comes to shove, their own continued existence and their own success will always be valued over user safety. As a result, I don't think that necessarily means you need to ban them, but you do need to make sure that you are telling them what you demand, and you shouldn't be afraid to require them to do things, because they are definitely capable of doing a lot of these things. I'd like to see these platforms improved. I think we should be bold in trying to figure out ways to do that, and a ban is a bold way, but I'd like to see other ways as well. 
I still fundamentally think that, at the end of the day, when these users turn 16, they're just going to get turfed onto platforms that haven't really improved. We can't just assume that it's an age thing, that buying a couple of years means everything will be okay. Ideally, the best outcome for Australia, given the way this has been rolled out, is that it's not the kids who need the time but us, to figure out the rest of what we can do to these platforms, so that by the time they come out the other side of this ban, it is actually a safer platform experience for them, and we will actually see some benefit out of this hold on them being on social media. Yeah, I think that's really well said. And part of me is also like, if they were really serious about getting young people off of social media and offering them some kind of better alternative, why didn't they spend the past year doing consultations to develop something that is for young people? Maybe the ABC, the public broadcaster, could have gotten behind it. Who knows? There was a whole alternative universe that could have happened here. But anyway, Cam, I always appreciate getting your insights on this. I really enjoy reading your reporting to keep up on what's going on in Australia, because I think you have a really great perspective on it that's both critical of the tech companies and not going to give the government a pass on whatever it is they want to do. So I really appreciate you taking the time to speak to me again. Thanks so much. Thanks, Paris. Cam Wilson is an associate editor at Crikey, writes the Sizzle newsletter, and is the co-author of Conspiracy Nation. Tech Won't Save Us is made in partnership with The Nation magazine and is hosted by me, Paris Marx. Production is by Kyla Hewson. 
Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters by going to patreon.com slash tech won't save us and making a pledge of your own. Thanks for listening and make sure to come back next week.