Cache Me If You Can

Beyond the Cloud: The Future of Data Centers

37 min
Feb 18, 2026
Summary

This episode explores the physical infrastructure of data centers—the real-world instantiation of cloud computing—examining their evolution, economic impact, sustainability challenges, and critical role in powering AI and digital services. Hosts discuss how data centers function as real estate assets requiring power, cooling, and connectivity, while addressing public concerns about energy consumption and the industry's commitment to fair utility cost-sharing.

Insights
  • Data centers operate as real estate businesses first, requiring strategic location decisions based on workload latency sensitivity, power availability, and fiber connectivity rather than proximity to users
  • The industry's economic contribution is substantial: 4.7M jobs, $404B in labor income, $162.7B in taxes, and $727B GDP impact in 2023, yet remains poorly understood by policymakers and the public
  • Liquid cooling innovation is becoming essential as AI chip densities increase, shifting from air-cooled to water-cooled systems to manage thermal loads efficiently
  • Data centers can function as grid assets through battery storage and demand flexibility, reducing need for new generation capacity and potentially lowering costs for all ratepayers
  • Emerging power technologies (SMRs, fusion, geothermal) funded by major tech companies represent a 10-year horizon solution to energy constraints, requiring regulatory support for deployment
Trends
  • Shift from single-building 16-20MW facilities to multi-building campuses with 40-50MW+ capacity, concentrating power demand and straining regional grids
  • Geographic diversification of data center development away from constrained markets like Northern Virginia toward underutilized regions with available power
  • Industry investment in emerging energy technologies (small modular reactors, nuclear fusion, utility-scale geothermal) as primary sustainability strategy
  • Data centers evolving from passive power consumers to active grid participants through battery storage and demand response capabilities
  • Increasing public and regulatory scrutiny of data center environmental impact, requiring industry transparency on cost-sharing and sustainability commitments
  • Liquid cooling adoption accelerating in AI/HPC facilities as chip density and heat output exceed air-cooling capabilities
  • Regulatory focus on grid interconnection timelines and processes to enable faster data center deployment without compromising infrastructure planning
  • Tech companies making explicit public commitments to pay full infrastructure costs rather than shifting expenses to other ratepayers
Topics
  • Data Center Physical Infrastructure and Architecture
  • AI Workload Training and Inference Deployment
  • Energy Efficiency and Cooling Technologies
  • Grid Integration and Demand Response
  • Small Modular Reactors (SMRs)
  • Nuclear Fusion Energy Investment
  • Geothermal Power Generation
  • Renewable Energy Power Purchase Agreements
  • Data Center Real Estate and Site Selection
  • Utility Regulation and Rate-Setting
  • Fiber Optic Connectivity and Latency Optimization
  • Data Center Security and Access Control
  • Co-location vs. Hyperscaler Facility Models
  • Grid Interconnection and Transmission Planning
  • Public Opposition and Community Concerns
Companies
Microsoft
President Brad Smith announced company commitment to pay full electricity infrastructure costs for AI data centers am...
Equinix
Multi-tenant co-location provider with 270+ data centers globally; Chris Kim's employer, used as primary example of d...
OpenAI
Mentioned as developer of large language models requiring significant data center training infrastructure and compute...
Google
LLM developer and investor in geothermal utility-scale power projects to support data center energy needs
Meta
LLM developer and investor in geothermal utility-scale power projects for data center sustainability
Anthropic
Large language model developer mentioned as creator of AI models requiring substantial data center infrastructure
Amazon Web Services
Commissioned third-party study showing AWS data centers pay more than anticipated rate costs, creating excess revenue...
Dominion Energy
Virginia utility company; CEO stated data centers have driven down transmission costs for all users despite 7+ year g...
Duke Energy
Utility company; CEO on record stating large data center users have kept rates lower in their service territory
People
Matt Pearl
Host and Director of Strategic Technologies Program at CSIS; leads discussion on data center infrastructure and polic...
Chris Kim
Guest expert with 30+ years in technology and 8 years in data center industry; primary source for technical and opera...
Brad Smith
Microsoft President; announced company's commitment to pay full electricity infrastructure costs for AI data centers
Quotes
"Data centers are the places where the cloud is instantiated in the real world."
Chris Kim (early in episode)
"In the real estate business, it's all about location, location, location. And I would guess that it's very similar in the case of data centers."
Matt Pearl (mid-episode)
"Any technology that's sufficiently advanced is indistinguishable from magic."
Chris Kim (mid-episode)
"The industry is not trying to shift expense from the industry onto other ratepayers. They're willing to pay their fair share for the power that they need to consume."
Chris Kim (closing segment)
"If your large users have the ability to reduce their use of the grid in those situations, you actually can invest less in building new power capacity, new transmission, new generation."
Chris Kim (late episode)
Full Transcript
Welcome to Cache Me If You Can. I'm your host, Matt Pearl, Director of the Strategic Technologies Program at CSIS. In this podcast, we take a closer look at the technologies and policies driving tomorrow and how the United States can stay ahead in the global innovation race. Welcome to Cache Me If You Can. I'm your host, Matt Pearl, and in this episode, we're taking a step back from the apps, platforms, and policies and looking into the physical data center infrastructure that makes our networks run. As technology and the need for faster, more powerful compute continue to scale, data centers have rapidly evolved, but not without controversy. Despite their central role in today's digital landscape and our economic growth, these facilities are increasingly facing backlash from the public, policymakers, and communities where they are built. Just this week, Microsoft President Brad Smith discussed his company's plan to ensure that it pays the full electricity infrastructure costs of AI data centers amid concerns over rising energy prices and strain on local resources. We're planning several episodes to better understand data centers. In this first episode, we're going to start with what they are, how they've evolved, and the debates surrounding their growth. For that discussion, I'm joined by Chris Kim. We're very excited to have him on the podcast today. Chris, welcome. Thank you very much. How have data centers evolved to meet the technology and sustainability standards of today, and what myths surround their buildup? We'll start with the basics of data centers. Chris, you've worked at the intersection of the technical and policy aspects of data centers for many years. But for many people, data centers and the cloud remain abstract. At the most basic level, what is a data center and what role does it play in the digital landscape?
Yeah, it makes sense that people don't really know what data centers are because the cloud is this construct where I can reach into the Internet and I can access what I need, whatever the resource is. But at their simplest level, data centers are the places where the cloud is instantiated in the real world. If you're using a consumer service, like you're backing up the photos that are on your phone to a cloud storage system, those photos are getting backed up over the Internet to a data center where they're stored. If you're using virtual meeting technology, if you're accessing your electronic banking, if you're using a smart home device, all of those applications ultimately have infrastructure that's deployed in data centers that allows them to work. Yeah, no, that's a really interesting point. You know, something that I've heard, and I'd be interested in your thoughts on, is that a lot of the actual owners of the data centers, in some sense, they're in the property business, right? Which is that, like any property owner, you're building a mall, you're developing real estate. You're concerned with obtaining the property, making sure it's in the right location, making sure you have access to the right resources like water and electricity. Do you think that's a fair point that reveals something about data centers? Absolutely. I mean, many of the largest data center companies are real estate investment trusts. So they are, in fact, real estate businesses first. But, you know, the analogy is a good one. And if you think about, say, an office building, you could build a big office building and it could be occupied by a single large company that has the whole building. It could also be whacked up into many different suites and floors and occupied by many businesses. It could be that it's a specialty office building that's near a hospital campus that just attracts medical professionals and medical practices that have a reason to be near each other in an ecosystem.
And we see all of those things in the data center world. So in the real estate business, this is a tried and true phrase, right, that it's all about location, location, location. And I would guess that it's very similar in the case of data centers. What are the considerations in terms of where they're located? How close do they need to be to, for instance, major urban centers where a lot of people are using these applications? So in the industry, we use this phrase called workloads. And that is to say the job to be done on a given set of resources, computers, routers, switches, whatever the infrastructure. And so it depends on the workload. Some workloads are very sensitive to speed, latency, the time it takes for an application to respond. You could imagine if you were driving in a self-driving car down a dark road at night and the cameras on the car were to see something in the road ahead, you'd want it to very quickly determine that was a deer and that the car should stop, right? If you're trading securities electronically in big exchanges in North Jersey, the ability to trade quickly might make for a competitive advantage that was worth real money. But there are other workloads that are not time sensitive. And what we've seen, particularly in the development of AI, is that there's a whole set of workloads that are around training large language models, either the large language models themselves, those that are developed by OpenAI or Anthropic or Google or Meta. And there's also the training of those models to do particular tasks. So if, for example, you want to teach an LLM to become a radiologist, you would show it lots and lots and lots of radiology images and share with it what the diagnoses were in each case.
And with the right training algorithm, you could teach the LLM to be able to see a new image that it'd never seen before and give you a diagnosis that would have a high rate of accuracy for at least whether or not you should investigate further, whether or not there's a problem with this image. That phenomenon can be done really anywhere. You just need access to the large sums of data and then the compute resources to make it happen. And so in those cases, they really don't have to be close to urban centers. But in another part of the AI transaction, which is inference, you're using a chatbot for a particular application. As an end user, we tend to like it when things are snappy. If it responds quickly, then we see that as more engaging. So in those cases, the inference of those models could be deployed in data centers that would be nearer to urban cores and where the most users would be. So that's super helpful. And something I think you can illuminate, there have been some recent announcements about joining different data centers and connectivity and new technologies to do that. Where does that come in in terms of what we're talking about and workloads and training and inference? Yeah, I mean, we'll almost certainly talk about this later, but when you're trying to site a data center, you know, the things you're looking for are, you know, you're looking for a piece of ground that you can construct on relatively reasonably, right? You'd like it to be flat. You'd like it to not require enormous amounts of clearing. You want it to have availability of power so that you can power the data center over the time it takes to build it or build a campus of data centers. But the other thing you're looking for is connectivity. Even if you're using one of those LLM training examples, you're training a large language model to do a task, you need to move an enormous amount of data close to the computer infrastructure such that it can do the training work.
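As a back-of-the-envelope illustration of the latency stakes behind these siting decisions (the numbers below are mine, not the episode's): signals in optical fiber travel at roughly two-thirds of the vacuum speed of light, so physical distance converts directly into response time.

```python
# Rough fiber latency estimate. Light in silica fiber propagates at about
# 2/3 of c (refractive index ~1.5); real paths add router hops and slack
# in the cable route, so treat these as lower bounds.
C_VACUUM_KM_S = 299_792.458  # speed of light in vacuum, km/s
FIBER_FRACTION = 2 / 3       # typical propagation fraction in silica fiber

def fiber_rtt_ms(distance_km: float) -> float:
    """Round-trip time in milliseconds over a fiber path of the given length."""
    one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FRACTION)
    return 2 * one_way_s * 1000.0

# A data center 50 km from its users vs. one 2,000 km away:
nearby = fiber_rtt_ms(50)     # ~0.5 ms round trip
distant = fiber_rtt_ms(2000)  # ~20 ms round trip
```

This is why inference capacity tends toward urban cores while batch training can sit wherever power and land are cheap: half a millisecond is invisible to a chatbot user, but tens of milliseconds are not.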
And the way you do that is over fiber optic cable infrastructure, for the most part. You can run fiber, you know, almost anywhere, but the further you are from other fiber, the more costly that becomes. So, you know, in the ideal place you would find a campus that you could develop that would be near a fiber right-of-way that would make it less costly to connect the data center. Some of the technology advancements that are going on in the data center space right now are about the way you interconnect very large numbers of AI chips into a neural network. And there's been a lot of innovation in that space. And so I think that may be some of what you're referring to, Matt, is there's innovation in the technologies that connect servers within a building or within a campus. And right now there are some limitations on how far one processor can be in one corner of the data center to another processor in the other corner of the data center. We optimize the way we run the cables to try and minimize those distances. Yeah. So this brings us to the technology or physical side of the data center. So it might be helpful just to walk us through what are the core components of a data center? If we walk into a data center right now, what's actually there? Yeah. So data centers are big consumers of electricity, which I'm sure we'll talk about. But that means that they have an entrance either to the utility, right? There's a connection to the local utility to get power, or there's some sort of power generation infrastructure on the site of the campus. And so at the edge of the data center, you might see electrical infrastructure, a substation where the utility brings the grid infrastructure to the data center. But at the data center itself, you would see transformers that are stepping down the voltage. You would see switch gear that controls how power moves through the building.
You would see electrical rooms that would have batteries in them and systems we call uninterruptible power supplies, or UPSs. They're there to ensure that the building can run if there's an interruption of power from the utility, like a thunderstorm or some other disruption. You would see generators behind the data center that would start within a few seconds or minutes of an interruption in service and would then carry the load. The batteries can carry the load for a few minutes, and then the generators can run more or less indefinitely. You would see that the building itself would be mostly allocated to computer equipment, servers, hardware, routers, network switches. And that computer hardware is usually in a space that we call the white space. It's the IT area of the data center. And often data centers are built so that that space is the domain of a certain kind of technician and expertise or customer. And then the area around the white space is the place where the equipment that supports the actual infrastructure goes. So the cooling equipment, you know, all the different infrastructure that makes the data center work. And then the final part of a data center is that, you know, data centers have to be secure. You know, they've got enormous investments in terms of the equipment and the know-how that has gone into deploying the equipment. Customers or owners of data centers want to ensure that that equipment isn't going to be interfered with. So it's not uncommon to see a fence, maybe a big, tall one. There may be guards at the perimeter of the building. To arrive at a data center, you are authenticated repeatedly to make sure that there's a good reason for you to be there. What about cooling equipment? What role does that play? Well, it's huge. So when you look at the way data centers work, one of the biggest expenses any data center operator has is the cost of power. And so they're constantly striving to improve the energy efficiency of their buildings.
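The efficiency goal described here is commonly quantified as power usage effectiveness, or PUE: total facility power divided by the power actually reaching the IT load, so a PUE of 1.0 would mean every watt goes to servers. A minimal sketch, with illustrative figures rather than numbers from the episode:

```python
# Power usage effectiveness (PUE) = total facility power / IT power.
# Cooling, lighting, and power-conversion losses push PUE above 1.0.
# The megawatt figures below are illustrative, not measured data.

def pue(it_load_mw: float, cooling_mw: float, other_mw: float) -> float:
    """PUE for a facility given its IT load and overhead power draws."""
    total_mw = it_load_mw + cooling_mw + other_mw
    return total_mw / it_load_mw

# An older air-cooled building vs. a tighter modern design:
legacy = pue(it_load_mw=10.0, cooling_mw=7.0, other_mw=1.0)   # 1.8
modern = pue(it_load_mw=10.0, cooling_mw=1.5, other_mw=0.5)   # 1.2
```

Lowering PUE is exactly the "provide as much of the power as possible to the white space" goal in the next exchange: the closer the ratio gets to 1.0, the less electricity is spent on anything other than compute.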
What you notice is first, you're trying to provide as much of the power that you consume as possible to the white space, to the IT load that is being powered in the data center. Everything else that you have to spend power on, like cooling, like lights, like other infrastructure, is taking away from the efficiency of the building overall. And so cooling is a place where there has been constant innovation in the data center industry. There's a big transformation happening right now. In the very earliest days of computing, you saw many computers that used liquid cooling, that used water to cool the systems. But we got away from that. Most of us now have a computer under our desk or at home or at the office that's got a fan in it. And the fan blows air across the infrastructure that's in the computer, and that's enough to keep it cool. But as computer chips have gotten smaller and smaller, which is a product of Moore's law, you know, this idea that computer processors are going to get ever faster over time, they've gotten smaller and they've gotten hotter. And we're getting to the point where you can't blow enough air across some of these chips to keep them cool. But if you use liquid, if you use water, you can, in fact, cool something many thousands of times more efficiently than if you use air. So if you were to walk into a newly constructed data center that was oriented towards AI or high performance computing, you would see lots of infrastructure for liquid cooling. And we really think that's the future of cooling in the data center because the workload densities have gotten to the point where it's just so hot that there's no effective alternative for cooling the data center. So as you talked about all these parts of the data center, I would imagine it's also enormously complex in terms of who is responsible for what, right? Which is you have the data center owner and operator. You also have the tenants.
You have the companies that are using the equipment. You mentioned security. How are all those things sort of allocated and handled? Does it vary depending on the type of data center? How does that all work in terms of who actually makes all these things happen? There are different models for different situations. But if we take an example, I work for this company, Equinix, which is primarily a co-location provider. They also provide data centers for hyperscalers. But just for a minute, they have more than 270 data centers around the world. And most of them are these multi-tenant spaces. There's lots of customers in the space. So Equinix would be responsible for securing the real estate, building the building, powering that building and contracting for utilities. they would provide service level agreements that guarantee that the air will be cool, the power will be constant, the security will always be present. They would hire the security guards and put in place the apparatus to facilitate. And then like a real estate transaction, you would lease a space in the data center. And that space usually had a cage mash around it, a kind of physical barrier that made it specifically yours. And it was like sovereign ground for an embassy, right? You know, you as the customer would determine who, if anyone had access to that space. If you wanted an Equinix technician to help you out with something, to go in and deliver a package, to remove a failed card and replace it with a replacement card, they would do that for you. But you had the control to be able to say who got access in those situations. And in that way, really the domain inside the cage, inside the white space was the domain of the customer. And they were responsible for their equipment, for their cybersecurity, for their access controls, for the execution in that space. And Equinix was responsible for delivering the high-performance computing environment that let them operate in that data center. 
And so I would imagine that some of the details of how that works really depends on the nature of the tenant, right? Because if you're talking about a hyperscaler in their facilities, they're really going to have folks on the ground to use the equipment, to adjust the equipment, to make sure it's working. I would imagine other companies, smaller companies that use data centers, don't necessarily have the resources. How does that all work in the case of small, medium-sized businesses that might have data center needs but don't have the resources that a hyperscaler would? Yeah, there's lots of models. Again, in the Equinix model, there is a staff, a technical staff that operates in the data center 24-7, and they're available. to serve customers for their needs. And so, like I said, if there was a problem with the device and you needed to power cycle the device, then an Equinix technician could take a ticket from you and would march to the cage. They would find the specific device and they would cycle that device and it would be restored. You know, inside data center environments, inside modern IT environments, there's lots of redundancy, right? Everything is anticipated as potentially going to fail and there's a backup system to make it work. But that means that there's constantly stuff broken, right? That there's things that have failed that you as an end user may never see. But, you know, what we want to do is restore the backup system so that it's there to back up the system that's currently working. And so there's constant activity in that way. And you're absolutely right. It depends on the size of the company, the resources, the economics of the situation. Bigger companies might have staff that are there. 
If you went into the financial services trading kinds of facilities in the data center space, you know, a few hours before the trading day starts, in march a lot of people; right after the trading day ends, a lot of people are out in the parking lot going home for the day. They're there to be able to run to a problem that might occur during the trading day that needed immediate intervention. And while there's stuff that happens 24-7, they don't need the same kind of response as at that moment. So, again, it depends on the situation. But for a small company, they might rely on their data center operator to provide many of the services that they need day to day. There's also an ecosystem of third party providers that exist in the communities where data centers exist that are providing different kinds of services. They might be doing electrical wiring. They might be doing backup services where they go in and they take backup tapes and take them to an offsite facility. There's a wide range of services that are provided in that way by third parties. So this is an area where we're just seeing a tremendous amount of investment. I think that's an understatement, right? There's some analysis that basically, you know, the bulk of our GDP growth in recent months has really been dependent on this data center build out. What are you seeing in terms of innovation, recent trends in data centers that you're particularly excited about as it really transforms our economy and is so central for our national security? I mean, I think we've all had the experience of using an application that we've never used before, and it can do something for us that we've previously done some other way and have it feel transformative. And I guess what I've learned over about 30 years of working in technology is that you can't anticipate all the ways life is going to change when you bring about a new technology, right? The way it gets applied.
That's maybe the most exciting thing is that we're at this new inflection moment with this new, incredibly powerful compute infrastructure that's going to enable all sorts of things. AI and, you know, the chatbots and those kinds of applications are the things that are on our minds at the moment. The fact that you can do creative image generation and video generation from prompts, those things are pretty amazing. You know, I worked with customers who were doing things like, you know, basic cancer research and treating patients. They were using AI to do things like, let's scan a database of all of the currently approved medications and the way that those medications actually attach to molecules in the human body and compare it to the genome sequence we have for this patient in this bed right now and see if there might be a medication that's been approved for something else that's already available that might treat this cancer patient who's in a bed right now. I mean, not even drug discovery, like I'm going to develop something for two or three or 10 years from now. No, no. Is there something I could do that might save this person's life? You know, there's some workloads you root for when you're working in this industry. And that's the kind of example. So I'm excited by that, just the power of the possibility of it all. Yeah. And I would imagine that this is an area where we really need to tell more of a story, right? Because in some sense, these technologies, and I'll admit I'm guilty of this, right? But these technologies that we've used over the years, they sort of work by magic in some sense, if you're a consumer, which is that you have the device, you have the system, and it just does what you want. And you don't necessarily see that invisible infrastructure we're talking about that actually makes it possible. Yeah, that's exactly what you get. There's a science fiction quote, right? That any technology that's sufficiently advanced is indistinguishable from magic, right?
There is that phenomenon. It can feel magical. And, you know, it's always got its challenges. It works until it doesn't. But, you know, somehow I've become a much better photographer, I think, thanks to the work of my telephone manufacturer over the years. There's ways the technology can be applied that bring real benefits to people. So you briefly mentioned challenges. And, you know, I think we should obviously just have a frank conversation, right? Because there's a lot of local opposition. There are a lot of concerns in terms of use of electricity, you know, water has come up as well. So can you talk a bit about those issues and what do you think are the most important sustainability evolutions that we've perhaps seen or will see in data centers to address some of those concerns? Yeah, I mean, I think as a practitioner in the industry space for the last eight years, I was surprised when we were executing so intentionally as a business that wanted to be sustainable, that had a commitment to being carbon neutral, that was funding new sustainable energy projects, to discover that opponents were emerging who were saying, you're not green enough. You're not environmental enough. And the conservation movement has often been at odds with the development of new data centers. And I didn't anticipate that. I really felt like we were some of the good guys in that scenario. And so there may be some misconceptions around, you know, the industry's actions in that space. But the fact is that, you know, in the time I've been in the industry, the amount of power that we build into a single building has increased several fold, right? We might have built a building eight years ago that was 16 or 20 megawatts. And today we wouldn't build a building that was less than 40 or 50 megawatts in the same space. Some of that's good, right?
You've got much more compute capacity and the ability to do more work in a single piece of ground encased in a certain amount of concrete with a certain amount of embodied carbon. You know, it's good that it's getting denser and more efficient. But it's also added strain to the grid because where we would have previously built two buildings over seven or eight or nine years, we're now building one building and using the same amount of power. And so we've seen constraint in the energy systems, not just in the United States, but around the world in different metropolitan areas where data centers have been built. And that has driven some changes in behavior. The data center industry is heavily concentrated in northern Virginia in the United States. It's not the only place they are. There are lots of other places in the United States where data centers are in pretty good number. But the fact that there's constraints on power in Northern Virginia now means that most developers are now looking at other places, other places in Virginia, other places around the country where they could build data centers in the future. Innovations are happening all the time, right? We see innovations in power that I think are particularly interesting, right? We have this moment in our current day where we are seeing load growth on the grid that's higher than what we experienced for a long period. And that's coming from data centers, but it's also coming from people moving to electric vehicles, people using electricity to replace gas and other fossil fuels in their homes and their businesses, and in reshoring and manufacturing. So all of these things are driving more load onto the grid. At the same time, we've also become really conscious that some of our power generation sources like coal are dirty and inefficient and pollute, and we want to retire them and replace them with cleaner, preferably sustainable, but maybe natural gas as a bridge kinds of generation technologies. 
But that means we've been taking capacity off the grid at the same time we've had an increasing demand, even as we're adding new sources, new renewables. So some of the stuff that I get really interested in is seeing the investments that the industry has made in funding energy in different forms. So they fund sustainable power projects like solar and wind by becoming power purchasers in advance of the development of a new site. That lets a developer have an anchor tenant and go raise capital and build a solar farm or build a wind farm. Those things did a lot to grow sustainable energy in the United States and around the world. Now, data center companies are investing in emerging power technologies, things like small modular reactors, a next-generation or fourth-generation nuclear reactor technology that isn't a tomorrow technology, but probably within less than 10 years we're going to start seeing SMRs in the United States. You know, companies like Microsoft have invested in nuclear fusion startups, you know, looking at harnessing the same reaction that goes on in the sun to create energy. That's a tech that, you know, is like maybe just on the precipice of becoming a real thing. You know, you see scientists have developed fusion reactions that actually generate more power than it takes to get them started. So we're on the cusp there. We've seen Meta and Google invest in geothermal utility scale power. You use the power of hot water and hot rocks that are under the earth everywhere and actually, ironically, leveraging technology that came from fracking to be able to capitalize on that resource. You can convert water to steam, steam to turn a turbine. You can generate electricity. Oh, and it's 24-7, right? It doesn't require the wind to blow or the sun to shine. And so those kinds of technologies, again, are probably not tomorrow technologies, but maybe that's a 10-year technology.
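The 24-7 advantage of geothermal that comes up here can be expressed through capacity factor: the fraction of a plant's nameplate capacity it delivers on average over a year. The factors below are rough, commonly cited ballpark ranges, not figures from the episode:

```python
# Average delivered power = nameplate capacity x capacity factor.
# Capacity factors here are rough illustrative ranges: solar and wind are
# intermittent, while geothermal runs around the clock.
CAPACITY_FACTOR = {"solar": 0.25, "wind": 0.35, "geothermal": 0.90}

def avg_mw(nameplate_mw: float, source: str) -> float:
    """Average power a plant of the given nameplate capacity delivers."""
    return nameplate_mw * CAPACITY_FACTOR[source]

# To average 90 MW around the clock, ~100 MW of geothermal nameplate
# suffices, while solar would need ~360 MW plus storage to shift output
# into the night.
geo_needed_mw = 90 / CAPACITY_FACTOR["geothermal"]    # 100 MW
solar_needed_mw = 90 / CAPACITY_FACTOR["solar"]       # 360 MW
```

That ratio is why a firm, always-on source is so attractive to a facility whose load never sleeps, even when its cost per nameplate megawatt is higher.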
We could see whole new entrants of carbon-free power on the grid that could really change life for the better, not just for the industry, but for everybody. Yeah. And this is where it seems like the combination of the role of the private sector and the government is really critical, right? You have all these companies that are placing bets on different technologies. Some of them will be right and some of them will be wrong. But that's the genius of the system, right? The ones that are right are really going to advance our ability to use this technology in an energy-efficient way. It's also where the government, I think, can do things, both on the side of making sure that it's easier to deploy. I mean, to deploy any form of nuclear energy in the U.S. at this point is very difficult. And if we're going to have these small modular reactors, you're going to need a relaxation of the rules and the ability to do it in less than, you know, decades. But it's also where, and I spent a lot of time in government, technology neutrality is really important: the government should support these technologies, find ways to invest in them, but also ensure that regardless of which technology ends up being successful, the government actually was supportive on the front end. Yeah, I mean, I think that's right. We're seeing stuff right now. One of the challenges for data center operators is getting grid interconnections, right? Getting to the point where they can connect a new facility. In Northern Virginia now, Dominion Energy says it's more than seven years if you want to build a new substation. And part of the problem has been that speculators and others in the past have put forward applications where it was dubious whether the projects would ever be built. 
So there's been a lot of discussion about, is there a way to accelerate that process for those that really do build the projects that they're imagining? And one of the notions that's out there right now is, could data centers be more than just heavy power users, and instead become grid assets themselves? And so you can imagine combining a technology like utility-scale storage: large battery systems could be deployed at a data center campus, and then that facility would have the ability to push power into the grid at different moments where it was needed, or to disconnect from the grid and rely on the batteries to power its operations. And so a lot of grid planners spend their time worrying about the hottest days of the year, or the highest power demand days of the year, to make sure there's enough capacity that the grid won't fall apart in that situation. If your large users have the ability to reduce their use of the grid in those situations, you actually can invest less in building new power capacity, new transmission, new generation. You have to do some of that, but you might not have to do as much. So that's a very much real-time example of technologies that are proven and exist that can be applied to change the problem. And it's going to require state regulators, usually utility regulators, to recognize this change of mindset. The U.S. Department of Energy has a part to play and has been advocating for some changes in this space. But to actually go and do this in a way that benefits everyone is something we really could do in a relatively short period of time. So those are really good points. And I would imagine also there's another angle to this, which is, as we deal with concerns about data centers, which are legitimate, right, and companies and government need to do things to address them, there's also the role of AI in solving a lot of these problems, right? 
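The peak-shaving arithmetic described above can be sketched in a toy example. All of the load numbers here are invented for illustration; the point is only the shape of the planning math:

```python
# Toy sketch of how demand flexibility shaves grid peak demand.
# All numbers are hypothetical illustrations, not real grid data.

baseline_peak_mw = 20_000      # regional peak on the hottest day of the year
battery_backed_mw = 1_000      # data center load that can switch to on-site batteries

# During the peak event, battery-backed facilities drop off the grid.
peak_with_flexibility = baseline_peak_mw - battery_backed_mw

# Planners size capacity to the peak plus a reserve margin (say 15%).
reserve = 0.15
capacity_needed_without = baseline_peak_mw * (1 + reserve)
capacity_needed_with = peak_with_flexibility * (1 + reserve)

avoided_capacity_mw = capacity_needed_without - capacity_needed_with
print(f"Avoided new capacity: {avoided_capacity_mw:.0f} MW")
```

Because capacity is sized to the peak, every megawatt a flexible user can shed during the worst hour avoids more than a megawatt of new build once the reserve margin is included, which is the cost-saving mechanism the episode describes.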
Which is ultimately, you talked about medical applications; renewable energy is another area where AI applications can really advance things, right? And so if data centers aren't allowed to flourish in some sense, our ability to solve those problems and attain energy efficiency and solve societal problems like that will be limited. Yeah. I mean, I do think the technology can be part of the solution, right? And we see it, you know, at scales large and small. It might be the development of the fusion company and the power of the technology that they deploy and the work they do. But it also could just be, you know, in the same way that your home thermostat can learn your patterns of arrival and departure and turn itself down or up as appropriate to make sure you use less energy at home, inside the data center we use comparable systems to try and maximize the energy efficiency of all the systems that run, and increasingly those are AI-powered systems. So, yes, there are real benefits that can come from that. Chris, let me reflect back to you the takeaways that I have from the episode. We examined the physical data center infrastructure that makes our networks run. This includes the role of data centers as real estate that enables their tenants, these are the companies running workloads, including AI, to operate and provide services. This real estate dimension includes both property rights and access to critical resources, including water and electricity. We discussed the white space part of data centers. This is the heart of the facility, the IT area where the servers, routers, and network switches are operating. We additionally talked about support infrastructure, the equipment in data centers that keeps everything running, such as transformers and switchgear. And finally, we covered the importance of physical security to data centers. We unpacked the complex economic impact of the industry, including its $727 billion contribution to our GDP. 
We also scrutinized the sustainability concerns surrounding power and water usage, such as the misconception that data centers shift infrastructure costs to local ratepayers, and the potential for data centers to act as assets to the electrical grid. We explained how critical it is for the industry to remain committed to being positive, engaged actors, including in terms of energy usage. The segment also envisioned a future powered by new energy technologies, such as small modular reactors, nuclear fusion, and geothermal energy. And finally, we reflected on how technology can feel indistinguishable from magic to the average consumer, which highlights the importance of demystifying the infrastructure that makes applications and services possible, to policymakers as well as the broader public. So Chris, does that summarize our discussion? And are there any other aspects of data centers that you think our listeners should know about? Yeah, I think that's a good summary, Matt, of what we've talked about so far. I think there are two things that I'd like to make sure we talk about before we go. The first is the economic impact of data centers. I think people don't realize that this has become a pretty good-sized industry. Every year, the Data Center Coalition sponsors a study, which PwC has conducted in recent years, on the economic impact of data centers. And in 2023, which is the last year we have data for from the 2025 report, there were 4.7 million jobs in the data center industry that paid about $404 billion of labor income in the United States. They paid $162.7 billion in state, local, and federal taxes. And the GDP impact was $727 billion. So this is a really substantial industry. 
And sometimes people have the idea that, well, there's not really much economic benefit from these data centers; it's not like if you built a factory and you had thousands of people working in the factory. Well, in fact, they deliver some pretty substantial economic benefits. The second is that you talked in the introduction about some recent communications about paying for power, and whether or not the industry has been trying to get away with shifting some of the costs of serving the power needs of the data center industry onto other ratepayers. There's a recent announcement from Microsoft that says they're going to be very explicit about ensuring that they never do that. Well, I can tell you from my work with Microsoft and other industry players that that's the commitment of the industry. The industry is not trying to shift expense from the industry onto other ratepayers. They're willing to pay their fair share for the power that they need to consume. And, in fact, there's some good study work out there that says that's mostly working so far. Power rates are usually set by local regulators in each state. Those power regulators have the ability to create classes of users and to set fee structures. They do long consultations with the utility providers, the transmission operators, the major users. It's a collaborative process. It can get contentious, but it produces good outcomes in most of the United States. And so we're committed to being good, positive, engaged actors in that process as an industry to make sure that we do pay our fair share. 
And I'll point to some research that Amazon Web Services commissioned, where a third party looked at their power usage and fees paid at different data centers across the United States and concluded that they were, in fact, more than covering the rate costs and expenses that were anticipated for their connections, creating what are called excess revenues in utility industry speak, which meant those excess revenues are being used to pay down the costs for other ratepayers. Here in Virginia, Dominion Energy has said that the existence of data centers has driven down transmission costs for all users. Duke Energy's CEO is on the record saying that large users like data centers have kept rates lower in Duke Energy's service areas. And there are other examples. So I guess there's this conception right now: we all sort of became aware of AI and of growing data centers around the same time that utility rates really started to creep up around the United States. And so they correlate. We say, well, there are all these data centers and utility rates are going up; it must be because of the data centers. But it turns out that's not the case, or it hasn't been the case so far. And regulators have the ability to set the rules to ensure that it is never the case, which the industry supports. Yeah, those are all good points, and really critical. That continued commitment from industry is going to be so important, because ultimately this isn't just about economic growth or some of the innovative applications we're talking about. We have the People's Republic of China that's also really building out these data centers and making a lot of progress. And so this is an area in which the U.S. has to stay competitive, and that starts with the ability to build out data centers here. So Chris, I really want to thank you for joining us today and helping to demystify the infrastructure that really powers all of the technologies that we use. 
So please join us in two weeks for another conversation about the technologies that are shaping innovation, competition, and geopolitics. Thank you. That's it for this episode of Cache Me If You Can. Don't forget to subscribe and follow CSIS for more deep dives into the technology shaping our future.