Science Vs

AI: Is It Ruining the Environment?

38 min
Nov 13, 2025
Summary

This episode examines whether AI is actually ruining the environment by investigating claims about energy and water consumption. Researchers from MIT Technology Review measured real energy usage and found individual AI queries use far less power than viral memes suggest, though aggregate consumption is rising significantly as AI adoption scales across institutions.

Insights
  • Individual AI queries (text, image, video) use modest energy—equivalent to 1-10 seconds of microwave time—contradicting viral claims that a single email uses a full water bottle
  • The real environmental problem isn't AI specifically but fossil fuel-dependent power grids; renewable energy infrastructure is the critical bottleneck, not AI efficiency
  • Water consumption by AI data centers (0.3% of US water supply) is modest and regional; the larger water issue is power plant cooling, which would improve with renewable energy adoption
  • Tech companies are deliberately avoiding transparency on energy metrics, forcing independent researchers to measure open-source models instead of proprietary systems
  • Public skepticism about AI's value (61% of Americans see more drawbacks than benefits) makes environmental impact feel less justified compared to accepted activities like flying or eating meat
Trends
  • Data center electricity consumption tripled from 2014-2023; AI data centers projected to use 25% of US household electricity by 2028
  • Tech companies pivoting to nuclear and gas-powered generators rather than waiting for renewable grid expansion, indicating short-term energy strategy misalignment
  • Growing local opposition to data center construction based on resource competition concerns, particularly in water-stressed regions
  • Industry shift toward model efficiency optimization (parameter reduction) to lower per-query energy costs as scale increases
  • Emergence of independent research initiatives to measure AI environmental impact due to corporate transparency gaps
  • Increasing integration of AI across institutions (78% of organizations now using AI) driving aggregate consumption growth despite modest per-use metrics
  • Public discourse conflating individual AI use guilt with systemic energy infrastructure failures, creating perception problem for AI adoption
Topics
  • AI Energy Consumption Measurement
  • Data Center Power Requirements
  • GPU vs CPU Energy Efficiency
  • Water Usage in AI Infrastructure
  • Fossil Fuel Dependency of Power Grids
  • Renewable Energy Transition
  • Large Language Model Parameters
  • AI Model Efficiency Optimization
  • Corporate Environmental Transparency
  • Video Generation Energy Costs
  • Data Center Cooling Systems
  • Regional Water Infrastructure Capacity
  • AI Adoption in Organizations
  • Environmental Impact Memes and Misinformation
  • Nuclear Energy for Data Centers
Companies
OpenAI
Receives 2.5 billion prompts daily; disclosed text query energy use (1-2 microwave seconds); declined detailed transparency
Google
Major AI provider; contacted for environmental impact information but did not respond by episode deadline
Anthropic
AI company contacted for environmental impact disclosure; did not respond by episode deadline
xAI
Deployed gas-burning generators to power Tennessee data center rather than waiting for renewable grid expansion
Mythbusters
YouTube demonstration comparing CPU vs GPU processing power using paintball gun analogy referenced in episode
People
James O'Donnell
Senior reporter for AI at MIT Technology Review; led independent research measuring actual energy use of AI queries
Casey Crownhart
Senior climate reporter at MIT Technology Review; co-authored research on AI energy consumption and grid implications
Shaolei Ren
UC Riverside computer engineering professor focused on sustainability; published viral research on AI water consumption
Rose Rimmler
Senior producer at Science Vs; hosted episode and conducted interviews with researchers
Blythe Terrell
Science Vs editor; co-hosted episode and provided context on AI environmental concerns
Quotes
"The thing about nuclear, reopening or building a new nuclear plant, is it takes so long. The last nuclear plant that we built in the US took 15 years to complete."
Casey Crownhart
"I think the villain is our reprehensible and baffling inability to switch to renewable energy and to put any kind of real effort into getting off of fucking fossil fuels."
Blythe Terrell
"If we had abundant solar and wind power and batteries, you know, we might be less concerned about some of this energy demand. But the reality is that grids around the world are still largely relying on fossil fuels."
Casey Crownhart
"It depends on the model. It depends what you're asking. And so there's just this really big range."
James O'Donnell
"I don't think I would do that again. I don't think that was worth the energy."
Rose Rimmler, on generating an AI image of her boyfriend with her cat
Full Transcript
Hi, I'm Rose Rimmler, I'm filling in for Wendy Zuckerman, and this is Science Vs. This is the show that pits facts against filling the world with AI data centers. Today on the show: AI and the environment. Lately, we've been hearing a lot about how power-hungry AI is. AI uses a s*** ton of electricity. Straining the nation's aging power grid and creating more planet-warming emissions. And how thirsty it is. The amount of water that AI uses is astonishing. Asking ChatGPT to write one email is the equivalent of pouring out an entire water bottle. One bottle of water. Let that sink in. And the major culprit here is the data centers. Warehouses full of computer servers that AI needs to function. And tech companies are trying to build more of these data centers. But people who live nearby are protesting them, often saying that they're going to compete for their electricity and use up their water. We do not want a data center built in St. Charles City. Because of all this, around some corners of the internet, using AI has become kind of a faux pas, especially if you use it for something silly. You are actively contributing to global warming and climate change, all because you want to Photoshop Chris Brown into your picture. So next time you use AI to generate an image for a meme, think about the impact on the environment first. Stop ruining your planet for a f***ing Instagram post! But on the flip side, we've got people pushing back against this idea. They say that these reports are skewed or misleading, and that the impact of AI on the environment isn't nearly as bad as a bunch of other stuff we're already doing, like eating meat or taking international flights. In fact, recently, some of the big AI companies have said that their products only use a tiny bit of power and a few drops of water for each prompt. So what's really going on here? Is AI actually ruining the planet, or have the bots been framed?
Because when it comes to AI and the environment, there's a lot of... Stop ruining your planet for a f***ing Instagram post! But then, there's science. And that's coming up after the break. And full disclosure: some AI companies do advertise on Science Vs. Idle money lies in your current account picking crumbs out of its belly button wondering, should I eat them? But when you start investing with Monzo, your money's always busy. It turns on regular investments, invests your spare change, and tops up your stocks and shares ISA. It even helps you make sense of what it's going to return. Monzo, the bank that gets your money moving. You could get back less than you invest. Monzo current account required, UK residents 18 plus, Ts and Cs apply. Welcome back, I'm Rose Rimmler. I'm a senior producer at Science Vs, and I'm here with our editor, Blythe Terrell. Hi, Blythe. Hey, Rose. Blythe, it seems like you're my AI buddy. I invite you to talk to me about controversies when it comes to AI. It's because I'm part robot, Rose. I would say that of the team, you're the person who is most tuned into this idea about AI using up all the energy, using up all the water. You've been into this for a while. Yes, I am actually one of those people who probably shared a meme, Rose, without knowing if it was true. On the water use or whatever. I do remember seeing those memes and being like, is this true? And then honestly, for me, it did make me take a step back from AI and be like, before I get involved in this, I do want to know the truth. Like, is this actually terrible for the environment? Is this actually terrible for the water? Because why would I want to integrate it into my life if it is? Right, so the first question is, why do we think AI would use so much more energy than all the other stuff that we do in our digital lives? Just messing around on the computer, posting on Instagram, watching Netflix.
Like, this is all stuff that we do pretty routinely and don't think a lot about the footprint of that behavior. Right, like looking at pictures of Jeff Goldblum. Yeah. Let's use that just as an example: hypothetically, photoshopping Jeff Goldblum as your prom date. You know, I don't know who you might be talking about, Rose, but that sounds like a pretty good use of electricity and energy, and water for that matter, no matter how much it takes. So that kind of stuff also requires data centers and energy to run them. But the thing that's different about AI is that its servers are using a different kind of computer chip. Normal computing uses a CPU, but AI uses a GPU. And if you're familiar with video games, you might think of this as a graphics card, but it's actually become the powerhouse behind machine learning. This is an extremely visual and potentially copyrighted analogy, but I was looking around on YouTube. I love a copyrighted analogy. The Mythbusters guys did a demonstration of a CPU versus a GPU, and in their demonstration they used paintball guns. Okay. The CPU was like programming one paintball gun to draw a happy face with one paintball pellet at a time, firing at a piece of paper on a wall. That's a CPU. A GPU was like 200 paintball guns all bound together making a mega paintball gun, and one switch is hit and they all fire at once, and the image that they create is the Mona Lisa. Oh, then those guys are good. Mythbusters. So the point is that while CPUs are good at doing one task after another, GPUs are good at doing a bunch of tasks at once, and that requires a lot more energy. Okay. How much energy? Are you ready for that? Yeah, I'm ready. And I got a bit of an assist here. I talked to some journalists who have covered this stuff for years. My name is James O'Donnell. I'm a senior reporter for AI at MIT Technology Review. I'm Casey Crownhart. I'm a senior climate reporter at MIT Technology Review.
So James and Casey both report a lot on AI and energy use. Mm-hmm. And about a year ago, they started a project trying to figure out: how much energy does an average query or prompt to, say, ChatGPT use? And they were inspired to do that because they were seeing all these numbers out there floating around that just didn't seem all that reliable. Here's Casey. These kind of wild estimates of, you know, oh, a query to something like ChatGPT uses this much water and this much energy, and isn't that so much. And so I think that started to kind of get our gears turning and wondering, you know, is that right? How can we add all of this up? What does it all add up to? So Casey and James looked around for the real number, but: We learned very quickly that it's not going to be so easy to know that number. Companies are not particularly willing to share the details of how much energy their AI models require to answer one question. And so, you know, we weren't going to get it from them. What did they say when you reached out and asked? They said, in so many words, no. So James and Casey went a different route. There are AI models that are not proprietary. Anybody can use them, even download them and host them on their own computer, as long as they have the power to do that. These are open-source models. And you can kind of open the hood, poke and prod them. And so James and Casey teamed up with experts, including academics at the University of Michigan, to measure it themselves. They ran a bunch of different prompts through an open-source large language model called Llama. OK. And then they were able to actually measure how much energy those requests required. And so they got some answers. Are you curious? Yes, I would love to know. Give me the answers. Well, there's a range here. I was really struck throughout this project... I think we went in and I was looking for kind of one definitive answer.
You know, like, what is AI's energy burden? And I think that one of my biggest takeaways was just how much it depends. It depends on the model. It depends what you're asking. And so there's just this really big range. That was one of my biggest takeaways. So, OK. Basically, when people like us say, I asked AI, we kind of act like AI is one thing. And it's totally not. There are all these different models. And these models come in different sizes. So a model is like a ChatGPT or a Gemini or a Claude or whatever. Yeah. And within ChatGPT, Gemini, Claude, there are multiple models. Within Llama, there are multiple. And some are bigger, some are smaller. If you imagine that this AI model... imagine it's the control deck of a spaceship, or actually my favorite is a switchboard with tons of knobs and dials. You can kind of imagine that's what these parameters are, James says. Each of those knobs is helping the AI come up with a better answer. But also each of those knobs requires energy to operate. So the smallest model that the team looked at for this analysis had 8 billion parameters. So, 8 billion. Oh, that sounds big. The biggest one they looked at had 400 billion parameters. 400 billion. And when it comes to the big players here, we actually don't know how many parameters they have. But James said, if you had to guess, it's... On the order of trillions. Whoa. Really big. A lot of knobs. A lot of knobs. I mean, I'm imagining basically a switchboard. But now I have to completely change it. Because it's like a switchboard that goes on for miles. Yes, that's right. So these parameters you're talking about, which are sort of what underpins the model, I guess. They are numbers, values. And the more of them there are, the better the model is at learning patterns and making predictions, which is how large language models work. OK.
OK, so say I have one request, and I pop it into a model with 8 billion parameters. And then I pop that same request into a model with 400 billion parameters. That same request is going to use different amounts of energy based on the model that I'm using. Yes. And actually, we don't even have to hypothesize here. I have real numbers for you. Oh, nice. OK. So the smallest Llama model that the team used, they fed in some prompts like "teach me about quantum computing" or "suggest some travel tips." And then they measured how much energy that used. The smallest model, when it spit out an answer, used on average 114 joules. Oh, great, great. That's very helpful. Are you being sarcastic? Do you want some other way to think about this? Yes, please give me something like that. Well, it wasn't me, you know, it was James and Casey. They came up with something for some context. One thing they converted these energy units into is a fun new type of unit called microwave seconds. I love the microwave seconds unit. It's so much more relatable than joules or watt-hours. Casey gets me. All right, so 114 joules is roughly a tenth of a second in a microwave. OK, so that's one query, small model: a tenth of a second in the microwave. Yeah, OK. The biggest model, which was 50 times bigger, that was like zapping something in a microwave for eight seconds. Oh, so that's the biggest model. That was still only for one query? One query, on the biggest model, was still only eight seconds. OK, that doesn't even get my rice remotely hot. Right. I mean, after James and Casey published their article, frustratingly for them, OpenAI and Google did release a little bit of information on how much energy their text prompts use on average. And what they said suggests that a text query is equivalent to one or two seconds in the microwave. OK.
So basically, what we can tell you is that a text prompt to a large language model is probably on the order of zapping something in the microwave for less than 10 seconds. Mm-hmm. OK. And then for images: this is a different kind of machine learning, but it also uses a fair amount of energy. And I would have assumed that this image-making thing is inherently more energy-sucking than text-making. But as it turns out, that is not necessarily the case. Here's James. If you have a really big large language model that's generating text and answers, it may actually use more energy than generating an image. And that was kind of counterintuitive for me, because you think about these AI models that come up with fantastical images that we've all seen over the past few years. And it just seems like such an intense process to kind of create that from scratch. Yeah, because it always takes longer, too, than getting your text back. Yeah, exactly. But what we found was that if you have a really large text model, it has so many parameters, so many knobs and dials, that it actually can use up more energy than generating certain types of images. They found that making an image was like running a microwave for five and a half seconds. So a big language model can be like 8 seconds of microwave time, and this is a little less. OK? Right. I'm with you. And for all this stuff, if you don't like microwave time, you could also think about it in light bulb time. So it's like running an LED light bulb for somewhere between 10 seconds and two minutes. So I guess what this maybe tells me is that if I wanted to make my Jeff Goldblum prom picture with AI, that is a slightly less energy-intensive process than perhaps using AI to write romantic Jeff Goldblum fanfiction. Possibly, if you use a really big model to write your Jeff Goldblum fanfiction. Obviously, I would need a very large model for this work, Rose. OK, got it. And then there's video generation.
This might not surprise you to hear: that used the most energy. So what the team did was look at an open-source video generation model. And they just made, like, a crap video. It was 16 frames a second, five seconds long. They compared it to the quality of a silent-film-era type film. And that one would be the equivalent of over an hour in the microwave. I don't think I've ever microwaved anything for an hour. I don't think I have either. A long time in the microwave, for sure. That one scares me more, because I'm seeing a lot of AI-generated videos out there. Yeah, that's what's huge right now. And getting bigger, right? There's a ton of, what is it, Sora? Yeah, Sora is big right now. Yeah. And they can look incredibly realistic, right? Right. We don't know if that's also using up as much energy. We asked OpenAI, which makes Sora, and they didn't give us any information on Sora's energy use. And James and Casey didn't want to speculate based on a different model. But it's probably using a fair amount of electricity. I think that's safe to assume. OK, so what we have so far is all about individual use. Yeah. But what I want to know is: obviously lots of us are doing this, lots of us are using this. What is the impact if you add it all up, if you scoop up all the AI use that we're doing? What do we know about that? Right. It's interesting. On an individual level, certainly the texting and image generation stuff, they're not that crazy energy-intensive. But that doesn't leave AI off the hook, because when you zoom out, to answer your question, it really adds up fast, because OpenAI says it receives 2.5 billion prompts per day from people around the world. Wow. And AI in general is getting integrated into all these institutions, which I think a lot of us are noticing. In fact, one survey of a variety of organizations around the world found that 78% of them are now using AI to some extent. That is a lot.
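For anyone who wants to check the arithmetic behind those "microwave seconds," here is a minimal sketch in Python. Note that the ~1,100-watt microwave is our assumption for illustration; the episode reports joules and microwave times but never states the wattage behind the conversion.

```python
# Convert between joules and "microwave seconds", the unit from the episode.
# ASSUMPTION: a typical microwave draws about 1,100 watts (not a figure
# given in the episode; 1 watt = 1 joule per second).
MICROWAVE_WATTS = 1100

def microwave_seconds(joules: float) -> float:
    """How long a microwave would have to run to use the same energy."""
    return joules / MICROWAVE_WATTS

def joules_for(seconds: float) -> float:
    """Energy a microwave would use if run for this many seconds."""
    return seconds * MICROWAVE_WATTS

print(microwave_seconds(114))   # smallest text model (114 J): roughly 0.1 s
print(joules_for(8))            # biggest text model (~8 s): 8800 J
print(joules_for(3600) / 1e6)   # a 5 s video (~1 h of microwave): ~4 megajoules
```

Under that assumed wattage, the 114-joule query works out to about a tenth of a second, matching the episode's figure; the one-hour video equivalent is on the order of four million joules.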
And so nerds have looked at how much electricity is going to data centers to see if AI has made an impact. And they saw that from 2014 to 2023, the electricity consumption of data centers tripled. That's according to a report from the Lawrence Berkeley National Laboratory. Oh, wow. So that's the AI period, like that's basically the age of AI taking off. Taking off. And the sector is expected to keep sucking up more and more. One analysis predicts that by 2028, AI data centers will use as much electricity as a quarter of US households use per year. Wow. Imagine adding 25% more households to the US in 2028. That's what the prediction is these AI data centers are going to use up. That doesn't sound good. Well, Casey, who's a climate reporter, she was like, you know, it's not the electricity per se that's a problem here. That's kind of the crucial thing that I like to bring up and really harp on: if we had abundant solar and wind power and batteries, you know, we might be less concerned about some of this energy demand. But the reality is that grids around the world are still largely relying on fossil fuels. So it's not good. Right now in the US, only 9% of the country's power comes from renewable sources. It's still mostly fossil fuels that we use to power our electric grid. Right. A third of our energy comes from petroleum. A third comes from natural gas, which is another fossil fuel. Both are greenhouse gas emitters. And coal. Coal is in the mix too. It's 8%. So just a lot of this energy is dirty. Mm-hmm. And of course, there are other countries with cleaner energy grids than the US. But more than half of the world's data centers are here in the US. Well, you know, I feel like some of the headlines I've seen around this, Rose, have been related to nuclear energy.
Because there were headlines a while back that one of these companies was going to reopen Three Mile Island, which is this nuclear plant that was shut down because of an accident. And so there's talk of that being reopened. And, you know, a lot of these companies are very interested in what's going on with nuclear. So it does make me wonder: could nuclear help, if we can get that ramped up? I asked Casey about that. And she was like: The thing about nuclear, reopening or building a new nuclear plant, is it takes so long. The last nuclear plant that we built in the US took 15 years to complete. Yeah. And companies are just not going to wait for that to happen. And they're not. They're not waiting for it. I mean, look at xAI. They brought in gas-burning generators to run their data center in Tennessee. Right. Okay. Well, that sucks. Yeah. So I reached out to xAI and I didn't hear back. I also contacted Google and Anthropic just to ask about all the stuff that we've been talking about. I didn't get answers from them by our deadline. Mm-hm. OpenAI did get back to me. They mostly pointed me to stuff that's already publicly available, open letters and blog posts, that kind of thing, talking about their energy use and how they see that in the future. And basically what OpenAI is saying is that they want to work with the government to add capacity to the grid. And they say that they want that energy to come from all kinds of sources, including renewables. Uh-huh. Okay. And just overall, I will say there might be some changes coming for the positive. The energy that AI requires to answer your query or make your image or your video, that could be going down, because a lot of the tech companies are trying to make their models more efficient. One way they're doing that is by turning off some of the parameters that we talked about earlier when they don't necessarily need them to answer a particular question or do a task.
So that's like shrinking the switchboard, essentially, as needed. So there is some evidence that the tech companies are trying to adjust. Yes. It might get better. Okay. So that's energy, Rose, but I know there's another piece to this. Yeah. What about water? What is going on with water? Right. So we're going to talk about that after the break. Welcome back. I'm Rose Rimmler. I'm here with Blythe Terrell. Hello. So let's talk about water and AI. And if you want to talk about water and AI, you call up Shaolei Ren. He is a professor at UC Riverside. He's actually in the computer engineering department, but he focuses on sustainability. Most people in his field look at energy and greenhouse gases, like we were just talking about. But Shaolei has kind of forged his own path, because he's thought about conserving water for a lot of his life. I spent my first few years in a small town back in China. We just had access to fresh water, drinking water, for half an hour each day. During that half an hour, we had to use a big bucket to collect the water and use it for the rest of the day. So in my memory, I just... I never thought water is something unlimited. It's a finite resource. Hmm. You never take it for granted. Right. And so one reason AI uses a lot of water is something that you've probably heard before. The data centers get really hot, because they're running all these fancy chips, doing all this computation like we were talking about earlier. And so these buildings often use a cooling tower that uses water to cool everything down. Just like our human bodies: we sweat and we feel cooler. For data centers, if you use water evaporation, you can take away the heat very naturally, very efficiently. And where do they get that water from? Most typically it's from the municipal water infrastructure system. So the same as where I would get it, if I lived there and turned my tap on. Yeah.
So they get the same water as everyone else, water from the faucet, basically. And the reason for that is they want clean, filtered water, because if there were salt or minerals or gunk in it, then it could gum up the system. Okay. So as the water cools the data centers, it evaporates away. It evaporates, and if I remember my kindergarten water cycle, when water evaporates, it eventually comes back as rain, right? So why do we need to worry about this? So the evaporated water, yeah, it still stays within our global water cycle system. It doesn't go away from the earth. But still, when the water will be coming back and where it will be coming back, that's highly uncertain. And it's very unevenly distributed across the globe. Due to ongoing climate change, we're seeing more and more uneven distribution of water resources. So essentially the wetter regions are getting wetter and the drier regions are getting drier. So even if the water is evaporated in, say, Arizona, that doesn't mean it'll come back as rain in Arizona, at least not anytime soon. You're correct. Okay. So the argument is: it's using a bunch of water, it's drawing it out from where everyone else is getting their water, and it's not necessarily going to be replenished that easily. Yeah, it's going to evaporate the drinking water in Tucson, and that water might next show up as a flood in Shanghai. Right. Okay. So let's talk about how much water is actually getting used here. Shaolei and his team went down this rabbit hole fairly recently, and they published a paper that kind of went viral. In fact, a lot of people turned their results into a meme, basically saying that every time you use AI... they'll say it in different ways. Like, every time you chat with ChatGPT, every time you write an email with AI, you're consuming a bottle of water. Have you seen this, Blythe? Yes, yes.
This was one of the memes I first saw and shared without evidence. I've seen videos of people filming themselves with a nice, beautiful, fresh bottle of water from the store, opening it up and pouring it down the drain and saying, this is what you're doing when you use AI. Or someone all dressed up and pretending to be AI, dressed as a robot, and they're just guzzling water. But that's not quite accurate. Yeah, so that's a distortion of the message that we show in the paper. Distortion. Distortion. Here's what they actually found. They found that if you have a back-and-forth conversation... in this case, the model they looked at was GPT-3, a slightly older model. But if you have a back-and-forth with GPT-3, medium-length messages, and you go back and forth on average about 30 times, that uses up essentially the volume of a bottle of water, a half liter of water. Mm. Okay. So it's like a decent conversation that gets you to that half liter. Yeah. And that's where the meme comes from. So it's not super-duper wrong, but what they're getting wrong, or misunderstanding, is that the fresh drinking water that's used to cool the data center is actually only a small part of this calculation. So out of this half liter of water that we're talking about, only about 12% of it is drinking water that's used directly by the data center for cooling. Oh. The rest of it is non-potable water from elsewhere. It's drawn out of rivers, lakes, whatever, and it's used in the process of making electricity. So that brings us back again to the power plants, that old chestnut. Okay, wait, wait. So some of this is drinking water, but most of it is not. Most of it's not. But still, that's water in the environment that could eventually become drinking water, right? So why does that distinction actually matter?
Well, if you think the data center moving into your town is a threat because it's going to turn on a bigger tap than yours, that's not quite right. And I asked Shaolei about that. Do you think it's possible that a town will accept a data center and it uses up all the town's water? Essentially, like, you live next to a data center, you turn your tap, and no water comes out? I think in certain towns it could be possible, but in most towns, the US water infrastructure should have the capacity available for data centers. He said that the biggest problems here would be likely to happen in really small towns with really old, limited water infrastructure. Okay. But when I see people talking about how data centers are using up water, I think we might be ignoring the bigger issue here, which is the water used by power plants. And by the way, if we had more wind and solar on the grid, the water use would go down. But anyway, as of right now, overall, taking into account the water used by power plants and the water used for cooling, we know that data centers consume 0.3% of the nation's water supply. I asked Shaolei about this. I don't know what to make of that. Is that a lot or is that a little, 0.3%? So it's roughly the same as the total public water supply of Rhode Island. So whether this 0.3% is high or not, I would say it's modest. It's not that much. Which brings up the question: should we be letting Rhode Island use all that water? I mean, what has Rhode Island done for anyone else lately? Yes, finally, the podcast is getting around to that question, which I've also had for years. What is the point of Rhode Island? Yeah. And the water used for the data centers, for the power generation and cooling, is projected to go up. It's actually expected to double in the next few years.
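The bottle-of-water arithmetic above is simple enough to sketch out. This is a back-of-envelope illustration of the figures as described in the episode (30 medium-length GPT-3 exchanges per half liter, with roughly 12% of that being potable water used directly for cooling), not a reproduction of the paper's full methodology:

```python
# Split the "bottle of water per conversation" figure into its parts.
# Figures from the episode: ~30 medium-length GPT-3 exchanges consume
# about half a liter, and only ~12% of that is drinking water used
# directly for data-center cooling; the rest is water consumed upstream
# in electricity generation (rivers, lakes, power-plant cooling).
BOTTLE_ML = 500            # half a liter, in milliliters
EXCHANGES_PER_BOTTLE = 30  # back-and-forth messages per bottle
DIRECT_COOLING_SHARE = 0.12

per_exchange_ml = BOTTLE_ML / EXCHANGES_PER_BOTTLE  # water per exchange
cooling_ml = BOTTLE_ML * DIRECT_COOLING_SHARE       # potable, used on site
generation_ml = BOTTLE_ML - cooling_ml              # mostly non-potable, upstream
print(per_exchange_ml, cooling_ml, generation_ml)
```

So a single exchange works out to roughly 17 milliliters, and of the full half liter, only about 60 milliliters is the tap water the memes are actually about.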
But ultimately, Shelly and another expert I spoke to said that whether or not this becomes a problem is a regional question. It makes more sense to be granular about this. Is the water being taken from an area that doesn't have the capacity? You just can't paint with a broad brush here. So it's complicated, I guess. Which is where we so often land. Okay, so taking all this together, Rose, where do you land? How evil is AI when it comes to the environment? I asked all of our guests basically that same question. I kind of put it in terms of, well, do you personally use AI, knowing about all these environmental impacts? Because all of these people care a lot about the environment and these issues. And all of them, Casey, Shelly, James, yes, they do still use AI. I'm awful at planning trips, so asking for an itinerary for a road trip or something, I've found that's really helpful. I use it to polish my writing and to help me answer some questions. And my students use AI to generate paper summaries. So I've used AI for technical things, like how to do certain repairs on my bike. But I've also used it for seeing what people have said on a certain topic, like hikes in New England with the best views. But everybody agreed that we should be thoughtful about how we use it, given the energy and water requirements. So it's annoying, because part of me is like, you know, the companies that make this, and that are using it for the products and services that I'm using, they're the ones whose AI use I want to think about, right? I want them to be thinking about whether they really need to use this or not. And I want them to be thinking about that in the context of energy use, water use, climate change, right? That's my dream. Yes, it's on the companies. It's on the government. I mean, my takeaway here is that I'm not sure AI is the villain.
I think the villain is our reprehensible and baffling inability to switch to renewable energy and to put any kind of real effort into getting off of fucking fossil fuels. Right. It's the same enemy we've been fighting for 50 years or whatever. Right. Also, I think one reason AI is getting people riled up, as opposed to those old climate offenders, flying, eating meat, that kind of thing, is that people see the value in the trade-off for the environmental impact of something like taking a flight or eating a burger. There's an obvious benefit to those things. With AI, yes, some people have found it really useful, but a lot of people haven't, and they just don't think it has much value at all. In fact, one survey found that 61% of people in the US think that AI has more drawbacks than benefits. Okay. So more than half of us are just like, overall? No, thank you. Exactly. Okay. I think that's one reason AI is our current villain. And in fact, I think AI is like a nostril on the larger villain, which is the evil monster that is keeping us glued to fossil fuels. Right. The nostril. Okay, I appreciate that picture. Okay. I do want to know one last thing, though. Has learning all of this, digging into all this AI and energy and water stuff, changed how you use AI? Yeah, a little bit. Also, I think the novelty is wearing off a bit, and I was never using it a ton, but I don't know. I was asking people, what's some stupid stuff that you've seen generated by AI where you're like, oh my god, that wasn't worth the energy? And I thought of my own playing around with it, and I was like, remember that time I had AI generate an image of my boyfriend cuddling with my cat? Because my cat doesn't like him. So I was like, oh, this is what it would be like if you guys got along, you know, and I sent it to him. And I was like, I don't think I would do that again. I don't think that was worth the energy.
So it has changed a little bit how you make that value assessment, kind of: is using AI for this thing going to add value? Is it actually really useful for this, or could I just glue salmon to his fingers and then the cat would actually maybe come over? Yeah, you know, yes, Rose, let's go back to basics. Let's go back to the basics of gluing salmon to our boyfriend's fingers to get our cats to like him. Yep, that's Science Vs. Thanks, Blythe. Oh, and while we're here, how many citations are in this week's episode? There are 66 citations. Where can people find them? They can find them in our transcript. The link to the transcript is in our show notes. Also in our show notes, we'll put a link to the article that James and Casey wrote for MIT Technology Review. It's really good. People should go read it. And people should also check out our Instagram. We've got some interesting stuff there. Maybe even a little Jeff Goldblum content for you? Give the people what they want. Exactly. Great. Love it. This episode was produced by Rose Rimmler, with Peter Leonard, Bumi Hidaka, and Bobby Lord. Thanks to all the researchers we reached out to, including Professor Melissa Scanlan. Special thanks to Andrew Pouliot and Jesse Rimmler. Science Vs is a Spotify Studios original. Listen for free on Spotify or wherever you get your podcasts. Follow us and tap the bell for new episode notifications. We'll be back soon.