Galaxy Brain

America’s Slide Toward Simulated Democracy with Eliot Higgins

59 min
Nov 28, 2025
Summary

Eliot Higgins, founder of Bellingcat, discusses how democracies are sliding toward simulated versions where verification, deliberation, and accountability functions become performative rather than substantive. He presents a framework showing how algorithmic social media platforms have replaced institutional gatekeepers, creating disordered discourse that filters people into extreme ideological communities and undermines democratic participation.

Insights
  • Democracy requires three functional pillars—verification (establishing truth), deliberation (debating ideas), and accountability (holding power to account)—which have shifted from institutional control to algorithmic mediation, creating hollow or simulated versions
  • Social media algorithms have democratized cult dynamics by serving personalized content that makes users feel special and recognized while defining identity against enemy outgroups, creating durable ideological entrenchment without a central leader
  • The shift from scarcity of information to scarcity of attention has inverted incentive structures; truth and nuance are algorithmically disadvantaged compared to outrage, making disordered discourse the default state unless actively countered
  • Local journalism and community-embedded institutions build trust through tangibility and accountability in ways national media cannot; rebuilding decentralized networks of verification, deliberation, and accountability at local levels is essential to restoring democratic health
  • The US is moving from performative to disordered democracy (similar to Hungary and Turkey), where electoral rituals persist but systemic corruption prevents meaningful accountability; without intentional reconstruction of functional discourse systems, democratic collapse becomes likely
Trends
  • Algorithmic radicalization through content filtering creates self-reinforcing ideological communities that function like virtual cults without central leadership
  • Erosion of institutional gatekeeping has created information abundance but attention scarcity, inverting 20th-century media economics and rewiring incentives toward engagement over accuracy
  • Rise of 'disordered counterpublics' as political coalitions united by institutional distrust rather than shared positive vision, creating cycles of radicalization and blame-shifting
  • Nationalization of discourse through centralized social platforms has displaced local media ecosystems, reducing community accountability and increasing susceptibility to coordinated manipulation
  • Democratic backsliding in developed nations follows a pattern of performative democracy (elections exist but lack substance) before sliding into simulated democracy (rituals persist but power is unaccountable)
  • Education and media literacy initiatives face organized pushback from those who perceive critical thinking instruction as institutional propaganda rather than skill-building
  • Decentralized, locally rooted networks of journalists, educators, and civil society are emerging as functional alternatives to platform-dependent discourse infrastructure
  • Generational screen time and device dependency issues extend beyond youth to older demographics, creating vulnerability to algorithmic manipulation across the age spectrum
  • Pendulum-swing political cycles accelerate as voters lose trust in institutions and swing between opposing sides, with some defecting to more extreme ideologies encountered online
  • Functional counterpublics require friction, disagreement, and accountability mechanisms, not frictionless consensus, to maintain democratic health and prevent institutional capture
Topics
  • Disordered Discourse Framework
  • Epistemic and Democratic Collapse
  • Algorithmic Content Moderation and Radicalization
  • Verification, Deliberation, and Accountability Functions
  • Open Source Investigation and Geolocation
  • Local Journalism and Community Trust
  • Media Literacy and Critical Thinking Education
  • Institutional Gatekeeping vs. Platform Democratization
  • Counterpublics and Social Movements
  • Simulated vs. Substantial Democracy
  • Attention Economy and Information Scarcity
  • Online Community Dynamics and Cult Formation
  • Institutional Legitimacy and Public Trust
  • Decentralized Network Building for Discourse
  • Democratic Backsliding in Developed Nations
Companies
Bellingcat
Open source investigative journalism company founded by Eliot Higgins; uses publicly available data and geolocation t...
The Atlantic
Publishes Galaxy Brain podcast; hosts the episode and editorial content; Eliot Higgins discusses work with Atlantic C...
Atlantic Council
Washington-based think tank where Higgins worked on disinformation and Russian interference issues; represents instit...
Something Awful
Early internet comedy forum (25+ years old) where Higgins participated in discussions about Arab Spring; foundational...
Twitter/X
Social media platform discussed as primary space for open source investigation community and discourse; now transitio...
Discord
Platform hosting Bellingcat's community server with 35,000 members for investigations, learning, and peer collaborati...
Jubilee
Video company known for debate formats pitting 20 people against one ('20 X vs. 1 Y'); criticized as hollow performance of democracy designed for algorithmic engag...
People
Eliot Higgins
Developed disordered discourse framework; founded Bellingcat to verify information using open source methods and geol...
Charlie Warzel
Hosts the episode; frames discussion around discourse, democracy, and institutional trust; solicits listener input on...
Renée DiResta
Researcher cited for concept of 'bespoke realities' created by algorithmic filtering and personalization of information
Hank Green
Referenced as previous guest on Galaxy Brain; discussed trust, gatekeeping models, and platform-based information dis...
Noam Chomsky
Author cited by Higgins as influential political reading during his earlier engagement with political theory
Seymour Hersh
Author cited by Higgins as influential political reading; represents institutional investigative journalism tradition
RFK Jr.
Represents alternative health counterpublic within MAGA coalition; example of disordered discourse coalition member
Tulsi Gabbard
Represents anti-NATO counterpublic within MAGA coalition; example of disordered discourse coalition alignment
Kash Patel
Associated with pizzagate narrative; represents conspiracy theory counterpublic within MAGA coalition
Nigel Farage
UK Reform UK party leader; example of political rejection of establishment institutions in response to disordered dis...
Quotes
"We've managed to build a system that kind of creates virtual cults automatically without us even realizing. So yeah, that's basically what the algorithms have done to us."
Eliot Higgins
"It's designed around capturing the algorithm. And I think it's bad for democracy and pathetic as well."
Eliot Higgins
"Discourse, as we define it here, is essentially just our ability to talk to each other, to find things out about the world, to debate those ideas, and establish ground truths, and reject the things that we don't like. It is our collective sense-making process."
Charlie Warzel
"The reason why my guest says it's simulated is because when people try to push back against that, when people do try to hold leaders to account, nothing functionally happens, or not enough happens."
Charlie Warzel
"We need to look at a different way of doing things. That's not to say, you know, full communism now or anything like that. But it's more saying how do you engage the public with the democratic process in a functional way?"
Eliot Higgins
Full Transcript
Hey everybody, it's Charlie. And before we get to today's episode, I had a request for all of you listeners. We're working on a story about screen time. And when we tend to talk about screen time, often the conversation will be focused on younger people. We're worried that they're getting too much screen time, or they've been radicalized by what they see on their devices, or that they don't seem to understand how they're being manipulated. But I've gotten a lot of anecdotal reporting over the last few years that the problem is similar, if not worse, on the other side of the age spectrum. And so we want to do a story about a different generation's relationship to this technology. We'd really love to hear from you. Whether you are somebody who is having some of these problems, or you feel your relationship to your device has become a bit problematic or lopsided, we'd really like to hear from you. Tell us your age and why you feel you have an unhealthy relationship with your device. If you've noticed this with a family member, we also want to hear from you. So if you could send us a brief voice memo, about a minute, no longer, we'd absolutely love that. More than anything, we want you to emphasize and describe what you're seeing, and what you're feeling about your loved one's screen time, or your own. And we want you to express whatever honest amount of concern you have. Please send that voice memo to cwarzel@theatlantic.com. That's C-W-A-R-Z-E-L at theatlantic.com. Thank you so much, and here's today's episode. You have these 20 X vs. 1 Y videos, which do this performance. Oh, you mean the Jubilee videos? Yeah, I despise those. I think they're just a strong example of the hollow performance of democracy. That no one's there to learn from each other, to come to a shared understanding. 
They're there for clips to get attention on social media, and that's, for me, the bottom line of those videos. No one's going there to have their minds changed. It's not designed around that. It's designed around capturing the algorithm. And I think it's bad for democracy and pathetic as well. I'm Charlie Wurzel, and welcome to Galaxy Brain. Today, we're going to talk about discourse, and not the shorthand on the internet for a viral outrage. When we talk about discourse, we're talking about like Cracker Barrel's logo changing and people being up in arms about it, or the latest political infighting. This version of discourse that we're going to talk about in this episode is much more substantial, and I think that the stakes are far higher. It's discourse, as we define it here, is essentially just our ability to talk to each other, to find things out about the world, to debate those ideas, and establish ground truths, and reject the things that we don't like. It is our collective sense-making process. It's how we do science. It's how we develop laws. It's the backbone of a functional and healthy society. And nowadays, when we talk about discourse, we often just refer to it as the discourse is bad. But there are all kinds of discourse. There's a healthy, functional discourse where there are elites that are held to account, where we can debate things, where institutions and people act in good faith and are benevolent. And there's a version of discourse that is kind of hollow, where there are good actors and bad actors, and we kind of just limp along there, even though there are a lot of inequities in the system. And then there's what my guest calls a disordered discourse, where democracy is almost simulated. You have people who get into power, and they wield it by imposing their views on the world, and making it so they can never really be held to account. This is something that I think a lot about today, and now with the Jeffrey Epstein investigation. 
You have this trove of emails that people are seeing, where you have elites talking behind closed doors and operating with impunity, because they don't feel like they're ever going to be held to account for the amoral or immoral things that they have done. And when that gets found out in the world, people get really, really, really angry. But in this disordered discourse, the reason why my guest says it's simulated is because when people try to push back against that, when people do try to hold leaders to account, nothing functionally happens, or not enough happens. And so you get this incredible frustration. And when people feel like their democratic participation isn't rewarded, they start to tune out or drop out of the system altogether. And it's very, very dangerous. My guest today is Eliot Higgins. He's an investigative journalist who founded the open source investigation company Bellingcat. They produce journalistic investigations using all kinds of publicly available data online. And Eliot is somebody who is a true internet native and really understands and has gotten into the weeds of all the different online manipulators, nefarious bad actors, and knows these platforms and systems inside and out. And so he's the perfect person to talk about this, not just because he has the experience, but also because he's developed a framework around disordered discourse. And in it, he has this idea that there are basically three conditions that allow societies to function, right? You have to be able to establish truth, debate what matters, and hold the powerful to account. And if you think about those three pillars right now, it doesn't really seem like we're doing a great job on a lot of them, you know? It is harder than ever to establish truth these days. Debating what matters is happening all the time, but it's happening in a very chaotic way, right? 
We've outsourced a lot of these conversations to these tech platforms that are not neutral, that constantly manipulate us, that drive us to be the worst versions of ourselves, that amplify outrage. We are operating in what the researcher René de Resta calls these bespoke realities. And so it leads to a discourse that is so disordered that it really threatens democratic collapse. And so Eliot Higgins is going to walk us through this framework and try to ground us a little bit, describe the temperature of the water that we are all swimming in all the time, and help us try to figure out, if at all, how we can claw it back. So here's Eliot Higgins. But first, a quick break. So, your excellent idea stays yours. Do that with Acrobat. Learn more and try it out on adobe.com. Eliot, welcome to Galaxy Brain. Thanks for having me, El. Yeah, absolutely. I wanted to start with your background and specifically something that you posted on Twitter, back when it was Twitter, where you talked about how the research work that you did that turned into Bellingcat, it started with, and I'm going to quote you here, me arguing with people on the Guardian Live, middle live blog comments, posting way too much on the Something Awful forums. During those arguments in 2011, there were videos shared from Libya, arguments about their authenticity. That's when I figured out you could use satellite imagery to figure out where these videos were filmed, stumbling into geolocation. I want to talk a little bit about how you fell into this work, and a little bit to set the stage here. What it was like to realize the depth and the breadth of all this publicly available information on the internet. Yeah, well, when I started, I was really just an orderly internet user in the early 2000s, and then becoming part of these online communities. There's something awful forum, which if you listeners don't know is a very old internet comedy forum that's been around, I think now for about 25 years, even longer. 
It's where a lot of this kind of meme culture originated along with a number of websites in the early internet. But it was also a place where they had a really actually quite good community of people who were taking part in discussions about in 2011 and 2010, what was happening in the Arab Spring. And I was involved with those discussions. But I just was frustrated with what I was seeing in the reporting in the media that you had all this kind of video footage being shared online, you know, people were getting smartphones and sharing stuff on social media. And it was being ignored by the mainstream media for the most part. And there were good reasons for that. I mean, questions of verification, for example, that they were from quite infamous stories early on about how the media were tricked by accounts like a blogger called Gay Girl in Damascus, who turned out to be a white guy. And they got very cautious about this online information coming from these Arab Spring countries. But I felt like there was something there that was useful. And I also really wanted to beat people in internet arguments because I was quite petty like that. So, so like every day, and you really see the factions appear on these internet forums, so the people saying things like, oh, Gaddafi is actually an okay guy. And it's, you know, NATO interference. And other people saying, oh, it's terrible. But I just wanted to know what was going on. So this is where I came up with these ideas of using satellite imagery to compare it to video footage coming from these conflict zones to confirm exactly where they were filmed. And that's something we call geolocation now. But that was like something we didn't really have a name for. I mean, I didn't have a name for it was like adults, what the difference. So really, that's where I started kind of using that to win arguments on the internet. But I just found more and more interesting things in these videos. 
And that then led to me starting, in early 2012, a blog called the Brown Moses blog, named after a Frank Zappa song, which was more a hobby for my own interests. But, you know, serious people took interest, and journalists contacted me about videos I was writing about. And it kind of grew from there, and in 2014 I launched Bellingcat. But that was really founded on the idea that we do the investigations, but people, you know, from the public can do investigations of open source evidence too. That's what's so powerful about it. But also, it's therefore valuable to teach people how to do it. So it was both the investigations and guides and case studies for anyone who wanted to do it themselves. And I think through, you know, now 11 years of Bellingcat, it's always been about not just the investigations, but how you spread those ideas and techniques to the public, to traditional institutions, and to new media. Have you always been someone who has been, I mean, there's sort of like an experimental lens to this, right? Have you always been a person who has, like, poked around on the internet, who's always been trying to see around those different corners? I grew up in a really interesting time in history in the UK, I think, in terms of technology. You know, my first computer, when I was probably three or four years old, was the Spectrum 48K, which was an old tape-based computer system that became really the whole foundation for the entire UK games industry. You know, it kind of put the UK on the map in terms of technology. And I love technology. My favorite TV program when I was like seven or eight was a BBC One program called Tomorrow's World, about the technology of the future. And the internet, for me, when I heard about it, was the most exciting thing, you know, you could possibly imagine. So I was a very early internet user. I mean, I was probably on, like, CompuServe when that was being distributed by Maxis. 
I'm aware that by saying CompuServe, there's a large amount of the audience who probably have no idea what that is. It was an on... I feel like there's a whole, yeah, online service providers is a whole thing, basically very early internet. And then old-school bona fides. Yeah. Yeah. So I was always, you know, looking for interesting stuff on the internet. And, you know, forums like the Something Awful forums played a part of that, because that was full of people digging out the weirdest stuff on the internet to show each other and laugh about. And, you know, then I think for my political journey, I was always quite interested in kind of alternative stuff, music, politics. But then with the invasion of Iraq in 2003, and how I saw that, it was like, you know, so much protest against it, you know, clear lies being told, and it still happened. For several years, I actually really switched off politics and kind of really disengaged from it, when I used to be someone who read a lot about politics. I read loads of things like Noam Chomsky and Seymour Hersh and a lot of writers who now seem to hate me. Sorry, it's kind of a bit weird. You made it. Yeah, I made it. And so, yeah, that was kind of my whole journey with this. And then the Arab Spring happened. So I was very online, seeing these videos. And it also became almost like a puzzle for me to solve, so I could understand what was going on in these conflicts. So going back really quickly, I think it's important, because we're essentially here to talk about disordered discourse. And Something Awful, because it's been mentioned a few times, is a comedy website whose forums were very popular in the mid-to-late 2000s and 2010s. And as people have argued, the site was genuinely, remarkably influential in birthing a lot of the dominant internet culture of the era. 
I think I've seen lists of things that say, you know, everything from like lolcats to sort of the standard meme templates to like 4chan. So I think there's something foundational about just having spent time in that part of the internet, and especially that forum. Does it inform this work that we're about to talk about in some way? Like, do you feel like that is foundational, to have come from places where a lot of this dominant internet culture sprouted out of? Yeah, I think so. I mean, as I started my work with Bellingcat and my earlier blog, I could already see these kinds of communities forming around the kind of stuff I was looking into, like the denialist communities. But I kind of understood weird online communities from the Something Awful forums, because there'd always be some drama between Something Awful and some other forum or online community. And they were always kind of, like, strange-seeming communities at the time. Now those online cultures are all over the place. But at the time, because the internet was new, all these things were strange and new. And I think that was part of what helped me. From 2015 to about 2017, I used to do work with the Atlantic Council, which is a Washington-based US think tank, meeting a lot of people there working on issues around disinformation and, you know, what Russia was up to on the internet. And it always struck me: these are really well-educated, intelligent people. But the thing is, while they were learning to be educated about all this really important stuff, there were a bunch of people who spent far too much time on the internet learning about that culture. And so we have these very intelligent people applying the kind of logic, you know, what they've learned, to this new environment. And there was just this dissonance they couldn't kind of bridge, because it's like they were making assumptions that always seemed flawed to me. 
And part of the work I did on disordered discourse really came out of that: there was a lot of focus on Russian disinformation, but I think there was too much focus on Russian disinformation in terms of the totality of what's going on. So we've kind of missed the forest for the trees. And that's why I wanted to kind of step back and say, okay, actually, break this whole thing down. Why is there so much disinformation? Why do people seem not to exist in the same reality as each other anymore? And that started, first of all, by creating this disordered discourse framework, which is really about how communities that believe in these fictions start forming in the first place. And so, I mean, you set it up perfectly. Okay, so I want to talk about this framework and what you've called this epistemic and democratic collapse, or at least this trajectory of how it happens, how we assess the health of a given democracy, et cetera. To begin with, you lay out that democracy basically rests on three functional minimums. Can you say what those are? Sure. So if you look back through political philosophy and academia over the last century, and even beyond, it always comes back to kind of three core ideas, in my opinion, without which you can't really have a functioning democracy. The first is the idea of verification: that as a democracy, we try to figure out what the truth is, because then we can deliberate on that truth. That's the second part, deliberation, where we create spaces where we, in a pluralistic manner, can bring people together to deliberate over the facts that we can establish through that process of verification. And finally, there is accountability. So action is taken on those things, and those who hold power can be held to account. Now, I will stress, these are kind of functional ideals. 
No democracy can actually ever truly achieve them, because by the very nature of democracy, you can never have 100% happy people on any topic at any one time. So it kind of starts with the idea that verification, deliberation, and accountability have to exist, because without verification, how can you deliberate on reality? And if you can't do that, you can't really have true accountability. But I expand on that by saying that you need to start thinking about three types of verification, deliberation, and accountability in democracy. There's substantial, which means you have functional democracy and functional systems. There is hollow or performative, which creates these hollow systems. And then there are simulated verification, deliberation, and accountability functions, which occur in disordered discourse. And that's basically the initial start of the framework. I love some of these terms, because they're very immediately evocative for me. I mean, I don't know if I have a ton of lived experience in the substantial part of this, which I think is part of the problem that we're going to get to. But this idea of a hollowed-out or sort of performative type feeling: when I think of discourse on the internet, I think of both those words that you use, performative and simulated, right? These ideas that it sort of doesn't matter, in a sense, what anyone does in this system; it's going to sort of churn out things the exact same way. And so my question here is, as you were coming up with this, because you call this the arc of democracy in some way, right? Where do you think, maybe using America as an example, where do you feel like we are right now on that arc? In terms of the US, very far into disordered. And it's a mix of these things, I would say, but if you imagine it as the arc, you are moving from performative well into disordered. 
And when you head towards disordered democracies, that's when you basically start seeing democracies turn into these competitive authoritarian governments like you have in Hungary, like you have in Turkey, where you still have the kind of rituals of democracy, you still get to vote, for example, but the system is so corrupted and skewed towards the autocrats that it doesn't really count. And you see that happening in Turkey, for example; you see that happening in Hungary. And that is where I think America is heading. I think the recent election results might seem like a bit of a relief. But my problem is, it's one thing to get another president in, get rid of Trump, win the midterms, but then you have to create those substantial verification, deliberation, and accountability functions in that democracy. And that's a much, much bigger task. And my fear is that what keeps happening is we just have this pendulum swinging back and forth, where people are just losing trust in one side, so they go to the other side, they lose trust in that, they swing back in the other direction, but some of them also swing in the direction of more extreme ideologies they're encountering online. So yeah, it's not a great situation to be in. That's really the frank answer. And what I find the most worrying is that in this model, that's almost the default state, for a number of reasons. What do you mean by that? So look at what the incentives are for the creation of these discourses. And let's just take it back a bit. These verification, deliberation, and accountability functions were done by institutions in the 20th century. So, for example, verification was done through newspapers and editorial processes, deliberation happened in parliaments, and accountability happened in courts. And that obviously had mixed success. 
One thing that's very important there is the idea of these things called counter publics, which are public movements that form around perceived injustice, like the civil rights movement, the feminism movement. And they change democracy through these same functions, they influence them. And it can be a battle sometimes, but it's kind of like those things have to exist to allow democracy to evolve. And the challenge that we're facing at the moment is that we've lost a lot of that. And it's been replaced by these online spaces where it's no longer about the truth getting through to people through institutions, it's about a kind of free for all for the most valuable thing to the algorithm, which is attention. So everyone is shaping the behavior around getting attention online, because that's the only way you can be visible in the information system, the whole media infrastructure is kind of shifting towards this attention based system, rather than one that's coming through these institutions. And the problem is that means the functions of verification, for example, are done by you and I, we see a post on social media, we're now the distributor of that information. And that changes everything, because it's no longer about scarcity of information and a lot of attention as we had in the 20th century. Now there's so much information, but there's a scarcity of attention because of that. And that dynamic is poisonous. This is something in a previous episode, I talked with Hank Green, a popular YouTuber, and we talked a little bit about trust and about the shift as you put it, basically from a gatekeeping model to a platform based model, this idea of sort of a democratic system of information that ends up being very disordered. 
And one thing that we were trying to tease out in that conversation a little was, I think, when you think about the trust in institutions, so many people who are now competing for that attention malign these institutions. It's very helpful to do that, and to say, well, the real reason there's a lack of trust in the media, or lack of trust in government, or public health, or you name it, is that these things have failed in some way. And what's sort of interesting about the framework, and I'm not asking you to absolve institutions or render a judgment, but it seems a little bit like the shift in the information system, in the way that it is distributed, in the way that we're all vying for attention, it seems like that is a bigger force and factor in this than the institutions simply having tripped and fallen and now having to deal with that. How do you think about that? Do you feel like, for the institutions, it's basically just not a fair fight in this new environment? Or do you feel like this is actually also a reaction to frustrations against elites, frustrations against institutions, that are legitimate? I would see it as a convergence of both of those factors, and other factors as well. They aren't competing; they're part of the same problem. And I think this is often where we kind of fall down when we're thinking about, for example, disinformation. We see it as a separate problem from other issues. So it's kind of like we see it as a problem of information. But this is kind of where I started with my framework. It was a problem of discourse: is the information actually able to reach people in a way that's functional, rather than what starts happening in these disordered systems, where they don't reject the idea of verifying information, deliberating, and accountability. They kind of pervert it through simulation. 
And then the internet is also brilliant at filtering you towards the most extreme version of the beliefs you can hold while also reinforcing them. Because as it's showing you that material, it will always include in that subset something a bit more extreme, and eventually, once you see enough of it, you might click on it. And then that serves you even more extreme content. And it filters you towards those communities who hold those views, who then provide you with all the information you need and the community to reinforce it. So, you know, it's a convergence of that. But the distrust is a big part of this as well. I think also we're living through a period where the growth of the 80s and 90s, which a lot of people saw reflected in a kind of meritocracy, has just fallen away, because everyone's working really hard, and my generation, and in particular younger generations, don't seem to be having the successes of their parents in many cases. And people recognize that. And, you know, because neoliberalism has been such a priority over the last 40 years, you've had places like union halls, public spaces, places where people could meet and actually deliberate and actually form these counterpublics, disappear. And instead, that happens in the online spaces where everyone's just mad at each other all the time because of the algorithmic recommendations they're getting. And that does not help democracy at all. It kind of undermines the entire thing, really.

Well, yeah, what you're describing is this perfect storm, right? And I think what is really important in what you said there is this idea of community, right? I think about this all the time. I mean, I try to assess it in my own mind when I'm consuming information and scrolling and thinking and trying to, you know, express my opinions or figure out what I'm going to write about or do.
And I think it becomes this hard thing for people to see: the ways in which humans are wired to want to be in community with other people, to want that acceptance, those bonds, and then the norms that those communities create, which are enforced by all this discourse and all these information systems that we're all plugged into. And the fundamental emotional and psychological pain of breaking from those things, right? Of saying something and being ostracized for it. I mean, it's so easy to talk about this stuff in a way that sounds very simple: oh, if you say the wrong thing, you will be canceled or ostracized or whatever. And we make a lot of hay out of that. But psychologically, when you get yourself into this position where these are your people, right? When you get into these groups, you're obviously facing a lot of pressure from outside people. You kind of hunker down. But if you do try to stray from that, there is this psychological pain and tax that these communities will exert on you if you break. And it feels like that's one of the things that just hardens this, right? That creates a more durable ideology.

Yeah, because whether they admit it to themselves or not, they deeply understand what turning away from that community means for them. Because if you hold extreme beliefs on pretty much anything, you aren't going to have many people in the real world who agree with you. And often your family members will start having, you know, bad relationships with you.
I mean, if you have really extreme beliefs around things like anti-vaxxing, for example, where people do have a strong opinion on the other side of the argument, you could find yourself with your only friends being that online community who agrees with you about all the things that you think, and where you actually get recognition for that. And that's something that's really important, because people don't feel recognized in the real-world spaces they're in. Because their family doesn't talk to them and all their work colleagues think they're the weird one in the office, they go online and they're a hero. So that is very, very appealing to people on that level. But it's also the fact that they define themselves not just in terms of the group they belong to, but against the people outside that group. Those aren't just other people; they are the enemy, or idiots. And it would be kind of admitting, if those people are right, that maybe you're the idiot, or maybe they weren't idiots all along. And that creates that dynamic as well. So the claws can get really deeply into you. And it's actually a lot like how cults are formed and controlled. We've managed to build a system that creates virtual cults automatically without us even realizing. So yeah, that's basically what the algorithms have done to us.

That's such a striking takeaway: that we've democratized the cult leader, or the cult dynamic, for every single person. You just need an algorithm.

It's as simple as that now. You don't need some strong leader, because you don't need someone telling you that you're right and how special you are. You've got an algorithm just serving you content that makes you think that.

So what can be difficult in talking about this is essentially, as your framework goes, this is basically the air we breathe, right?
Like this is embedded in everything, in every conversation that we have, in every debate that we have, and in the way that every institution is trying to claw onto its legitimacy.

Because you're going to be thinking, after you've done this, which clips will go the most viral and what is going to get this attention, because you are as much part of this economy and you can't escape. It's not something you can opt out of, because you kind of then cease to exist. You know, if you work in the media, that's a big, big problem. So everyone is, even me, I'm on Bluesky, writing these threads, thinking what's going to get the most engagement on these threads. But underneath that, because I have this understanding of this model, I want to do it in a way that's functional. That's part of, you know, creating good information. And that's always really been what Bellingcat has been about: our investigations add information that allows people to understand, deliberate, and hopefully bring about accountability on a range of topics. And I realized doing this work that Bellingcat was a functional counterpublic, because it had formed in reaction to, you know, not the biggest injustice in the world, but the way in which the media was ignoring this content from these conflict zones. And it was designed around the idea of verifying that, and we create spaces for deliberation. So, you know, Twitter, now X, really was a big part of the open source community. But as things have changed, we've created a Discord server that has about 35,000 members, and investigations come from that. People learn how to do investigations and, you know, learn from each other in that space. So creating those functional spaces is also something that we do with Bellingcat. And then finally, we do accountability as well in the work we're doing on a range of topics.
Well, and too, you know, I feel like with Bellingcat there's this idea, right, of do your own research. And that has become, obviously, a shorthand for: take your amateur lens, where maybe you don't have all the right information to process what you're seeing, but go do it anyway and form your own conclusions. And in a very real way, Bellingcat is a way of doing your own research through a system that has more guardrails. You are basically, at least in my mind, the good-guy version of do your own research.

Well, you can see it in terms of the framework. So we can see ourselves as doing a functional version of that, but there's a disordered form of it, where people are doing their own research, but they're doing it through this moral-epistemic lens that means they're eliminating certain sources as untrustworthy because those sources disagree with the worldview of the system they're already part of. Then we also have hollow versions of it as well. I mean, in terms of deliberation, you have these 20-versus-1 debate videos, which kind of do this performance.

Oh, you mean the Jubilee videos?

Yeah, those. I despise those. I think they're just a strong example of that kind of hollow performance of democracy. No one's there to learn from each other, to come to a shared understanding. They're there for clips to get attention on social media. And that's, for me, the bottom line of those videos. No one's going there to have their minds changed. It's not designed around that. It's designed around capturing the algorithm. And I think it's bad for democracy, and pathetic as well.

What would a healthy, functional Jubilee video look like to you?

I mean, the concept of them and how they're set up... unfortunately, it would look boring to most people. I think this is the problem.
That's the point too, right? It would be so substantial that it would appear boring next to the rest of the content.

And this is the challenge. You've got two or three seconds to grab someone's attention. And if you've got some idiot scowling at some other idiot on social media, that'll capture someone's attention most of the time. Or they just draw a line through the video so it catches your attention, whatever new ideas someone comes up with to irritate you enough to stop scrolling. But it's not about building a better democracy; it's just about content that gets attention on the algorithm.

Right. So how big, and this is maybe a silly question given what we've just talked about, how big a crisis is this disordered discourse? How do you capture the scale of it? How do you conceptualize the size and the stakes of all this?

It's, I think, something that is best measured in the effects it has on society. And I would say what's happened in the last nine months in the US is a really strong warning to everyone else of which way it can go. I look at the MAGA movement as a coalition of disordered counterpublics. So if you look at, for example, the alternative-health counterpublic, that's what RFK Jr. represents; you have the kind of anti-NATO, chemical-weapons-denialist community, which Tulsi Gabbard represents; you have the Pizzagate crowd that Kash Patel represents. It's not one unified group with the same shared ideas; it's just that they have alignment around the idea that Trump's going to give them what they want, and that they all distrust institutions and they're going to change those institutions. But that's a really big problem, because it creates a cycle where these people look for problems that don't exist, that they truly believe exist. Maybe some of them are grifters, some are true believers, and they go after that problem and they can't really solve it.
So when they fail to solve it, they blame the outsiders: they say it's the woke media, or whatever it is, it's liberals. And then they further radicalize their own viewpoints against that outgroup. And I think we're still very early on in that process, but when that starts happening in the US, I think that will be a real moment of seeing how bad things have got for democracy in the US. It's nice that everyone can pat themselves on the back about the good election results recently, but unless they build something beyond that, we're just going to fall back into this kind of swamp of algorithmically mediated information that, I feel, is really, really difficult to escape from.

Well, and there's this way, too, of looking at election results. I mean, there are unending conclusions one can draw, which is why it's sometimes difficult to parse. But one thing that you're also seeing that applies to this framework you have, especially in America, is a kind of ping-ponging back and forth, right? A rejection of whoever seems to be holding power at the current moment. You have Trump, then the reaction to Trump's handling of the pandemic and certain other things, okay, Biden, we need someone else, then all the reaction to that, and Trump back in. You have this sort of wave of rejection of, honestly, the institution itself, right? As soon as you can gain control of the institution, there seems to be this rejection of it. And that may be too tidy a bow to place on it, given what is also happening concurrently with the MAGA coalition in America. But I also think that it speaks to this dynamic, right? It is extremely hard, once you grab a little bit of authority now, to face this information system that is essentially geared towards undermining that authority.
I mean, does that feel right to you?

Yeah. I think the US is fairly unique in terms of having the two-party system, so you have that ping-pong effect. In the UK, for example, Reform UK, which is a very minor party despite the coverage it gets because of Nigel Farage, is now leading in the polls, which seems like really bad news, but that seems more like a rejection of both the Conservative Party, the previous government, and the new Labour government, which has been doing pretty badly to begin with. So it is a reaction against something, but that's the problem. Our politics are becoming about reacting against something because it's let us down. And we're surrounded by a system that will continue to remind us how rubbish things are, because saying everything's great does not get you on the algorithm. Saying everything's rubbish, here's some violence, here's some sex, here's some just really bad news: that's what gets people looking at your stuff on social media. Not everything's sunshine and rainbows, unfortunately. And again, it comes back to these algorithmic incentives that are being created for people. And it's this kind of dragging-down-into-the-swamp effect: once you're in there, once the claws are in you through social media, it starts dragging down the whole democracy with it.

We have this diagnostic framework here that you've come up with. And then we have this idea, especially with democracy and also with institutions, that there is this old system trying to cling on, trying to work in the pre-algorithmic world, still trapped in that zone. And now we need democracy to adapt to this system. What do we do? Help us out here. What do we do?

It's a big job. First of all, there needs to be a recognition that the fundamental information system we exist in has changed dramatically.
And it's really important to understand that institutions no longer dominate that verification, deliberation, and accountability process. They also used to control which voices were allowed to be heard. But now any voice can be heard, and voices get heard if they do something to engage the algorithm, which usually is not something that's truthful; it's something that is engaging. So we have to understand that's the system we live in. Now, the question we have then is: do our legislators really have any interest in stopping social media algorithms? And I think in this particular time period, for the next 11 months or so, probably not. And this means, okay, we have to look at alternative action. I think it has to come from the grassroots. It has to come from the public, because, again, the institutions ain't what they used to be. And we need to look at a different way of doing things. That's not to say, you know, full communism now or anything like that. It's more saying: how do you engage the public with the democratic process in a functional way? And there's a whole variety of ways of doing that. So I describe something called the ARC framework, which describes eight tracks of activity. And these tracks include things like education. For example, at Bellingcat, we've been working with schools on a pilot program to teach critical thinking skills, just to see how that works in a school. And there's a huge amount of interest in it. We work with universities to create open source investigation courses where students learn how to do the investigations. But also at those universities we have investigative hubs, so those students can do their own investigations without direct intervention from Bellingcat, but also connect to their local communities. Because I think local media has to play a really big role in this as well. So we have the education aspect of it that can connect to the kind of media side of it.
How do we actually get this stuff out there?

I just want to pause on that for a second, because I've talked with academics and researchers around this idea of education. And it's obviously very important, from the media literacy portion of it all the way up to what you're talking about in terms of research methods, giving people the ability to actually go and do that work. And I understand that every time I write about that or report on it, it is met with such harassment and fervor from people on the right. Very clearly, it shakes propagandists to their core. They get extremely reactive about this. And I wonder, do you get a lot of pushback there? The educational part of it feels like almost the most crucial building block of a path that is a little healthier from an information perspective. It also seems like the most fraught, in terms of people all of a sudden getting very suspicious: what are you teaching? Okay, you're teaching this critical thinking. Is it really critical thinking, or is it teaching them to blindly trust institutions, or what have you? Are you seeing that kind of pushback when you try to implement some of these programs or ways of thinking?

Not when we implement them, but when certain people hear about them. Their framing is more that, oh, they're going to go into schools and teach young people propaganda against the right, or whatever, or on behalf of the government. They don't see us teaching skills; they see us teaching propaganda, which is absolutely not what we're about. I mean, the whole point of what we're trying to do is recognize that we no longer have this relationship with institutions where we rely on them to verify the information that comes to us. And that really was the reality of the 20th century through the media. I mean, you could buy a book, but who decided to write the book? Who published the book?
So now we have this free-for-all of information, and we need to recognize that we live in that environment. And to navigate that environment, media studies and critical thinking skills are not optional anymore.

When I think about the real demise of local news, I mean, it's so acute here in America, though obviously it's an issue globally. And then the nationalization of news. I've lived recently in a couple of small towns, and you can really see how that local model is a virtuous cycle of trust-building, right? The people who do the reporting and the writing and the verification and the holding people to account: there is also this way in which the community does that with them, right? Your local columnist, or your local investigative consumer reporter, is also a parent in the schools. You see them on the sidelines at the sporting events and things like that. There's this way in which they're all intertwined. The community can verify them and hold them to account, and they hold the community to account. And there's this real notion of tangibility, I think, with all of it, that builds trust, right? This is not some abstract person. When you get rid of that and you nationalize it, and that's part of what these information systems have done too, right? They have nationalized our conversation so much. Like, I know more about the mayor-elect of New York City than I know about any politician truly within 500 miles of my home. There's something a little mind-boggling about that too. But what the nationalization of that incentivizes is people dropping into your community, right? Someone from a magazine or whatever organization comes in when something happens. I think about this as a national reporter myself. I've had to go into communities after, say, school shootings. And I don't know a single person in the community. I don't know its norms.
I'm trying to do my best. But it's ultimately an experience that alienates, almost in both directions, and can cause that distrust. And I think of all of this, of how it creates that cycle, how it adds to the disordered discourse, right? Because we aren't speaking from that position. And I think the localization of all of this is the community element of it. It sounds, I think, to a lot of people very pie in the sky, right? But re-establishing those networks, I feel, is the crucial first node, right? In restoring trust and saying, okay, maybe I don't trust all journalists, but I definitely trust the ones in my community, because I'm watching them work. I'm seeing the impact of that.

I think some people look at that and, yeah, they do say, how is that possible? But one of the things that I found really interesting working on the ARC framework, with these eight directions of activity, is that you'll find people in different kinds of silos: there'll be journalists, there'll be people working in education, some civil society organization, who never talk to each other, never know each other exists. Yet they're actually working on something in that same direction that overlaps really nicely. So by having that, rather than thinking, oh, I'm an NGO, or I'm a journalist, we can start saying, okay, I'm interested in this stuff, I'm doing stuff here, and we can look at ways to collaborate on that. Because finding those collaborations isn't just a nice way to do a bit of extra work; it's also about building relationships between the people in those collaborations. And it's almost as if we have these online spaces that allow us to connect internationally and nationally.
But we've got to connect those to the real-world spaces in these local areas, like through these university hubs, almost as if they are nodes in a network that we're trying to build. And it's not a network built around one center; it's very decentralized. And that's, I think, really core to this as well. Because as soon as it starts to feel like that network is owned by someone, that's when people start losing trust in it.

I think that's right. One thing that I'm getting from all of this, and from reading the work and talking to you, is that you might think, right, that this idea of trying to reorder or restore some order to our discourse would be a call for the removal of friction, the removal of disagreement. Because right now we exist in a system where, when you're experiencing it, when you are participating on Twitter, X, Bluesky, name your social network, or reading the news, or watching politics in action, there's so much of that friction and disagreement and psychological pain and dunking, whatever. But what I'm getting from you actually is an insertion of friction, of disagreement. This whole idea of functional democracies needing these counterpublics, right, these groups holding people to account, is putting friction in the system. Do you think of it that way? That actually adding friction is the way we get a more ordered discourse?

Yeah. I mean, it's as simple as saying that if you need a functional democracy, you need functional counterpublics. And the problem is we aren't seeing those forming in this current information system, because of the incentives that these platforms create.
So the question is how you start creating those healthy counterpublics before it becomes impossible to even do so, because institutions become captured by that disordered discourse. And for me, I feel like we're staring that in the face in the US at the moment; we're so close to a tipping point. And I think Americans really need to realize that they are a very short way away from becoming just as bad as Turkey or Hungary, and just as hopeless when it comes to restoring democracy in those countries.

I feel like, listening to all this, what feels so insidious is that these tools that have really helped create this disordered discourse that you speak about, this democratic collapse of sorts, have been sold as ultimately democratic tools: tools that give voices to so many people, that get rid of these gatekeepers, that allow for what you would think is this democratic flourishing. And it's so difficult for me to grapple with that, I think, because it puts people in a difficult position, from any kind of institution, from any kind of former authority position, to say this explosion of voices and perspectives is actually leading to a lot of chaos, and we do need some order in the system. How do you think about messaging that to people in a way that doesn't erode the trust, doesn't make people roll their eyes, and talks about this idea of trying to balance this democratizing of opinion, but also to do it in a way where there is some sort of order and structure, and isn't eroding the good parts of the systems that we have?

I think, unfortunately, we live with a lot of good examples of what happens when this runs out of control. And those are becoming more and more apparent to everyone. I think, again, the US is unfortunately a really good example of this at the moment.
But one thing I found presenting this work is that a lot of people have kind of felt this problem, but they've not been able to conceptualize it. When I explain it to them using this framework, it all comes together in a way that they can understand. And that's where I think we need to start: with a story. We can't just say, you know, there are too many people believing the wrong things, and here's how we tell people to believe the right things, because that's not how it's going to work. It's going to be about how to actually improve the whole system, so that people can feel empowered to actually do this stuff, so that they see there's value in doing this stuff. Because we can have the most functional verification and deliberation in the world, but if there's no real accountability, it's still going to start moving towards those performative and disordered forms. We need there to be not just the stuff that the public is doing, but real accountability. And I think we're getting very, very close to a point where, if we don't get a handle on this, it's going to start dominating our politics, our democracies, until we don't really have democracy anymore, both in the US and the rest of Europe.

I think that is a sobering place to leave it. I thank you so much for your work. I do have one question. We asked Hank Green this, because he is someone who thinks about attention and understands the media, and we joked about it earlier. Give me a potential headline for this YouTube video.

Oh my God. What are we going to do to attract maximum people? The most engaging thing we can do... We're going to be transparent about our process. You have to say something's definitely been completely destroyed: the destruction of democracy. It's good if you use the same D, you know, the desperate destruction of democracy, or something like that.
That actually sounds like a quest from out of this world too.

No, something less cheesy, but, you know, you need something that's really punchy. It's definitely the end of the world. You've got to get people listening, because unfortunately that's the attention-driven economy we live in.

All right. We will recklessly ramp up the stakes of this. Now, I think the stakes couldn't be higher. I think your work truly couldn't be more important to getting us all to understand this better, like describing the temperature of the water in the aquarium that we're all swimming in, and it's getting hot. So, Eliot Higgins, thank you for all your work and for coming on Galaxy Brain.

That's great. Thanks for having me.

That's it for us here. Thank you again to my guest, Eliot Higgins. If you liked what you saw or heard here, new episodes of Galaxy Brain drop every Friday. You can subscribe to The Atlantic's YouTube channel, or on Apple or Spotify or wherever you get your podcasts. And if you'd like to support my work or the work of the rest of the journalists at the publication, you can subscribe to The Atlantic at theatlantic.com/listener. Thanks so much, and I'll see you on the internet. This episode of Galaxy Brain was produced by Nathaniel Frum and edited by Claudine Ebeid. It was engineered by Dave Grine. Our theme music is by Rob Smierciak. Claudine Ebeid is the executive producer of Atlantic Audio, and Andrea Valdez is our managing editor.