Planning the Grid in an Age of Uncertain Demand Growth
41 min • Jan 27, 2026
Summary
As AI data centers drive unprecedented electricity demand growth, grid operators face critical forecasting uncertainty. Rather than pursuing perfect predictions, E3 experts argue utilities should reframe this as a risk management challenge, using scenario planning and financial safeguards to prepare for multiple possible futures while managing the speed mismatch between fast-moving tech companies and slow-moving grid infrastructure.
Insights
- Single-point demand forecasts create false precision; scenario-based planning acknowledging uncertainty ranges is more appropriate for managing large load additions that represent binary outcomes rather than statistical distributions
- Data center developers are submitting multiple redundant interconnection requests across utilities to hedge location risk, artificially inflating apparent demand and complicating accurate forecasting
- Financial symmetry—aligning data center financial commitments to utility risk exposure—is essential to protect ratepayers from stranded assets when speculative projects fail to materialize
- Demand response and on-site generation can serve as planning hedges, enabling faster grid connection while transmission studies proceed, though data center flexibility is more limited than commonly assumed
- A fundamental cultural mismatch exists between tech companies operating on 18-month Moore's Law cycles and utilities requiring 10+ years for transmission infrastructure, creating unavoidable delays and conflicts
Trends
- Speculative behavior in data center development creating phantom demand in interconnection queues across multiple RTOs
- Shift from single-point forecasting to scenario-based planning in long-term grid transmission outlooks
- Growing use of non-firm service options and faster interconnection as incentives for data center flexibility over price signals
- Increasing reliance on behind-the-meter generation (primarily fossil fuel-based) as data centers seek operational independence from grid constraints
- Regulatory focus on financial security requirements and collateral from large load customers to protect ratepayers
- Potential wave of natural gas plant investments to meet near-term reliability needs for data center loads
- Backlog-driven slowdown in data center growth as physical constraints and interconnection queues reach capacity
- Emergence of risk management frameworks in utility planning previously reserved for competitive market participants
- Growing importance of demand response as a planning asset rather than just an operational tool
- Tension between national security/economic competitiveness arguments for rapid data center deployment and grid reliability/cost protection concerns
Topics
- AI Data Center Electricity Demand Forecasting
- Grid Operator Capacity Planning Under Uncertainty
- Large Load Addition (LLA) Integration Challenges
- Scenario-Based vs. Point-Forecast Planning
- Demand Response as Grid Planning Asset
- Transmission Infrastructure Development Timelines
- Financial Risk Allocation for Speculative Projects
- Rate Design for Large Industrial Consumers
- Interconnection Queue Management and Speculation
- Behind-the-Meter Generation and Onsite Resources
- Regional Transmission Organization (RTO) Market Design
- Grid Reliability Standards and Resource Adequacy
- Utility-Tech Company Speed Mismatch
- Stranded Asset Risk Management
- Non-Firm Service Options for Data Centers
Companies
Energy and Environmental Economics (E3)
Consultancy that released a December paper on uncertain demand forecasting and governance solutions; employer of both guest experts
PJM Interconnection
Largest U.S. grid operator that lowered demand forecasts in January despite record capacity auction prices
Meta
CEO Mark Zuckerberg publicly stated power is constraint on company's growth, exemplifying hyperscaler demand pressure
MISO
Regional transmission organization using scenario-based planning with multiple futures in 20-year transmission outlooks
ERCOT
Texas grid operator pushing forward on non-firm service options and faster interconnection for data centers
People
Shana Ramirez
Director of Asset Valuation and Markets at E3; discusses sources of demand uncertainty and risk management frameworks
Arnie Olson
Senior Partner at E3 and Penn Energy Management graduate; explains consequences of forecast errors and scenario planning
Andy Stone
Host of Energy Policy Now podcast from Kleinman Center for Energy Policy at University of Pennsylvania
Mark Zuckerberg
Meta CEO quoted as stating power is constraint on company's ability to grow AI business operations
Quotes
"A data center can be built in less than a year. Some transmission and generation can take three to 10 years to build, just depending on the size and location and all of that."
Shana Ramirez•~12:00
"Either it comes online or it doesn't. So that's a huge amount of uncertainty, much larger than the uncertainty that I would normally face as a utility planner just planning my organic growth."
Arnie Olson•~14:30
"The mistake is treating the forecast as a prediction rather than just an input to making your decisions, whether that's resource procurement or investment in upgrading your transmission system."
Shana Ramirez•~32:00
"I think of it as like a cultural mismatch. Data center developers operate in a highly competitive, fast moving environment and utilities operate in almost the exact opposite and a very regulated, risk averse one."
Shana Ramirez•~42:00
"It's going to be messy for the next five years. And this is going to play out in a thousand different individual cases across the country, across all 50 states."
Shana Ramirez•~68:00
Full Transcript
Welcome to the Energy Policy Now podcast from the Kleinman Center for Energy Policy at the University of Pennsylvania. I'm Andy Stone. Electricity demand in the U.S. is growing at a breakneck pace, driven largely by power-hungry AI data centers. Grid operators and industry watchers have responded by repeatedly revising their electricity demand forecasts upwards, sometimes dramatically, to estimate where future demand is heading. Yet something unexpected happened in January. PJM Interconnection, the largest grid operator in the U.S., actually lowered its demand growth expectations in its January 14 forecast. This reduction came just after the market's most recent capacity auction, where expectations of booming power demand led to record-setting prices and costs that will ultimately be borne by consumers. There are many threads to unpack here, but today we're going to focus on one in particular, which is uncertainty. Specifically, the uncertainty around how much power AI data centers, new manufacturing facilities, and crypto mining operations will actually need from the grid. Getting this number right is critical. Overestimate demand and consumers pay for excess grid infrastructure they don't need. Underestimate it and the reliability of the grid could suffer. All the while, the challenge of getting new data centers online quickly has been framed as nothing less than a question of the nation's economic future and security. But here's the catch. Demand forecasts are inherently uncertain, even more so in today's dynamic electricity system. On today's podcast, I'll be speaking with two industry experts who suggest reframing the challenge. They argue the question isn't as much about forecast precision as it's about coming to terms with the uncertainty that forecasting inevitably entails, and then operating the grid in ways that embrace that uncertainty. 
This means developing rules and processes, what's broadly called grid governance, that prepare us for a range of possible futures. My guests, Shana Ramirez and Arnie Olson, are with Energy and Environmental Economics, a power industry consultancy. In December, E3 released a paper describing the uncertain demand future facing the grid and proposing governance solutions to manage that uncertainty. Shana and Arnie will walk us through that thinking and give us a detailed look at the reliability and economic challenges uncertain new demand presents. Shana and Arnie, welcome to the podcast. Thank you for having us. Thank you. And Arnie, welcome back as well to Penn. I understand you are a graduate of Penn's Energy Management and Policy Program. Yes, exactly. Yeah, I had a chance to visit the Penn campus last summer and really enjoyed it. And so I'm very happy to be on the podcast. Well, it's great to have you back, both of you back here today. So let's go ahead and get started. Power demand has been growing very rapidly in this country, and forecasting the pace of this growth has suddenly become a critical priority issue for the electric power sector and for policymakers. So to start us out, can you tell us what is at stake with these forecasts? Well, as you mentioned in your introduction, the forecasts are what the utilities and the grid operators use to develop their plans for how much infrastructure needs to be added. So what's at stake really is that question. How many new transmission lines, how many new distribution lines, how many new power plants need to be developed to serve the load that's expected? In order to right-size the infrastructure, you need to have an accurate forecast of how much load you expect to serve next year, the year after, and the year after.
The more uncertainty in your load forecast, the less you can right-size those investments, and the higher the risk that you either over-invest, and as you noted, result in higher costs for consumers, or under-invest, and potentially degrade the reliability of the power system. So Shana, I want to take a moment here. We're going to be talking a lot about what's called large load additions or LLAs in the industry. I want to ask you, what qualifies as large and why do these projects in particular create big challenges for the grid? So there's no single industry definition of large. We see thresholds anywhere from one megawatt in some large load tariffs to 100 megawatts. I think what matters more than the absolute size is the degree to which a single customer can materially change the system. A useful way to think about it is proportionality. So adding 200 megawatts to a large RTO footprint is very different than adding 200 megawatts to a smaller vertically integrated utility. In many cases, these loads represent a step change, not a marginal increase. And then the planning challenge comes from timing and irreversibility, right? So generation and transmission take years to build and recover costs over decades. When a large load arrives quickly or departs unexpectedly, it creates a mismatch that is difficult and expensive to unwind, or can't be unwound at all. So these big loads are coming in. They're coming in really fast, faster than the grid has ever seen before. And you're saying getting the generating capacity and the grid ready to serve these new loads takes a lot longer than it does to actually build the loads themselves. Very much so. A data center can be built in less than a year. Some transmission and generation can take three to 10 years to build, just depending on the size and location and all of that. And I just want to also get a little bit more perspective here.
You mentioned 200 megawatts is an example. I assume that some of the data centers could consume that much capacity. But give us an idea. What does 200 megawatts look like in something that we might be able to relate to in terms of a number of homes that could be served with that amount of capacity or something like that? Let's say a medium-sized utility might have a million customers, and that utility might serve 5,000 megawatts of peak load. That utility, if it's growing at kind of a normal industry rate, 1% or 2% per year, might be adding 100 to 200 megawatts per year at most, just from organic growth. Now, the thing about organic growth is that it depends on, you know, a million customers, and maybe I'm adding 10,000 or 20,000 customers this year. Maybe the economy is growing at 1% or 2% or 3% or 4%. Because there are so many customers, I can really capture the uncertainty just through the law of large numbers. I can use statistics to just understand what the range of potential reasonable outcomes might be from my future load forecast. If I'm looking at a single customer that has 250 megawatts of load, that might be two years' worth of organic growth. And that's not a law of large numbers, you know, normal distribution. That's a binary distribution. Either it comes online or it doesn't. So that's a huge amount of uncertainty, much larger than the uncertainty that I would normally face as a utility planner just planning my organic growth. Now, if I have 10 of those customers that are 250 megawatts each, that's 2,500 megawatts. And remember, my utility, hypothetical utility, only had 5,000 megawatts that it was serving at the starting point. So that then becomes just a tremendous amount of uncertainty for that utility to deal with. And understanding how many power plants it needs to add and how much new transmission it needs to add, etc., becomes a really fraught exercise. So those really are, again, outsized additions. 
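Arnie's contrast between statistical organic growth and a single binary load can be put in rough numbers. The sketch below uses the hypothetical figures from the discussion (a 5,000 MW utility, a 250 MW large load); the 25% relative spread on organic growth and the 50/50 odds on the large load are invented for illustration, not from the E3 paper.

```python
import math

def organic_growth_sd(peak_mw=5000.0, growth=0.02, rel_sd=0.25):
    """Organic growth: many small customers, so the law of large
    numbers keeps the spread around expected growth fairly tight.
    rel_sd is an assumed relative standard deviation."""
    expected_add = peak_mw * growth             # ~100 MW/yr
    return expected_add, expected_add * rel_sd  # e.g. +/- 25 MW

def binary_load_sd(size_mw=250.0, p_online=0.5):
    """A single large load is Bernoulli: it comes online or it doesn't.
    SD of the outcome is size * sqrt(p * (1 - p))."""
    expected = size_mw * p_online
    sd = size_mw * math.sqrt(p_online * (1 - p_online))
    return expected, sd

organic_mean, organic_sd = organic_growth_sd()
big_mean, big_sd = binary_load_sd()
print(f"organic growth: {organic_mean:.0f} +/- {organic_sd:.0f} MW")
print(f"one 250 MW load: {big_mean:.0f} +/- {big_sd:.0f} MW")
```

Even at even odds, the single load's standard deviation (125 MW) swamps the spread on a whole year of organic growth, which is the planner's problem in miniature.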
And so I guess the next question that comes out of this is there's a lot of uncertainty around how many of those new 200 megawatt or whatever they may be, you know, demand centers are coming online. And this uncertainty, as I think I mentioned earlier on, is really the focus of your recent paper from E3. And that's titled, for people who are interested in looking it up, "Forecasting Large Loads in the Age of AI Data Centers." So I want to ask you, Shana, what are the main sources of uncertainty around how many data centers or how much data center demand will actually materialize in the coming years? The first source of uncertainty, in my opinion, is the speculative behavior that's happened with some entities. So right now, many different types of entities are positioning themselves as data center developers when they haven't ever done that before, and oftentimes submitting multiple interconnection requests to multiple utilities to hedge timing and location risk when they may not ever have any desire to actually build in that utility service area. So that inflates the apparent demand, but without a corresponding increase in actual projects. And I think the second piece to that is AI is such a new technology. Nobody knows, at least I don't know, whether it will stand the test of time. We don't know how the compute efficiency and architecture and the business models will evolve. It's hard to plan for something when you don't have any idea if that demand will still be there in the future. And, you know, I think history tells us that energy intensity rarely stays constant over time. So there's a lot in here. There's speculation about how many of these centers will actually come along. There's speculation about how much energy they're actually going to use if and when they're up and running, right? Throwing the spaghetti against the wall with some of these speculative proposals from the data center developers, you just don't really know what's real and what's not.
Just to chime in, there's also uncertainty about where they'll locate. So, you know, we don't have a good guide, unfortunately, to tell them if you locate here, that'll be the lowest-cost interconnection. The only thing that they can do is just ask; they make a request. Can I interconnect 500 megawatts here? And then the utility has to go off and study what it would take to interconnect 500 megawatts in that location. But because those studies take time and they don't know what the answer will be, what they do is put in a number of requests. Say, can I locate here? How about here? How about here? How about here? So they might have 10 requests in to different utilities for every one or two data centers that they actually intend to build, because that's what they need to do to be able to identify a good site to locate from the perspective of the system. So there's both uncertainty about how much AI demand there will be overall, as Shana noted. There's also uncertainty about where exactly it will locate and how many of those will locate in any one given utility service area. So does that mean that some of these new data centers are getting double, triple, quadruple counted in the load forecast because they're actually proposing to build the same data center in many different areas? Potentially. And I think that that is true. What we recommended is to start counting them when they're more certain. So when there has been some financial capital put down or when energy service agreements have been executed, that makes it much more real than just a name in a queue. Well, another point that you make in the paper as well is you point out that not only do we have the uncertainty about what's coming, but we don't necessarily have really strong data on what exists out there today. I understand there's not a real, I don't know if consistent is the right word, but something like a consistent database of what actually exists on the grid today. Is that right? In terms of data centers.
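The maturity-based counting Shana recommends, counting a request more fully once capital is down or an energy service agreement is signed, can be sketched as a weighted queue total. The stage names and weights below are illustrative assumptions, not from any actual tariff or from the E3 paper.

```python
# Hypothetical queue entries for one utility: (requested MW, maturity stage).
# Weights are assumed: signed ESAs and committed capital count nearly fully;
# early speculative inquiries barely count toward the forecast.
STAGE_WEIGHT = {"inquiry": 0.1, "study": 0.3, "esa_signed": 0.9, "capital_down": 1.0}

queue = [
    (500, "inquiry"),       # same developer, three hedging requests
    (500, "inquiry"),
    (500, "inquiry"),
    (300, "study"),
    (250, "esa_signed"),
    (200, "capital_down"),
]

# Naive counting treats every request as real load.
naive_mw = sum(mw for mw, _ in queue)
# Maturity weighting discounts the speculative entries.
weighted_mw = sum(mw * STAGE_WEIGHT[stage] for mw, stage in queue)
print(naive_mw, round(weighted_mw))
```

The naive total (2,250 MW) is dominated by three hedging requests that likely represent at most one real project; the weighted total (~665 MW) is a less dramatic planning input.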
Yes, that's definitely true. And any type of database or anything that does have some of that information is very much incomplete. And it's also going off of, you know, the announcements of data centers that they're going to be building here or news that this is happening. So it's very much not a science at this point. Well, part of it is, you know, we have some data center developers that are, in effect, hyperscalers saying that power is the constraint on their growth. And we almost hear that they'll take as much power as you can possibly deliver. Now, that can't be true, right? There has to be a limit to their appetite at some point. But this is what the proponents themselves are saying. Mark Zuckerberg has been out there saying power is the constraint on our ability to grow our business. And just given the state of uncertainty in the AI business, as Shana alluded to before, they all are kind of in a race to develop this new technology. It's almost like a land rush. You know, if you get there first, then you have a huge advantage over all of your competitors who are trying to catch up. If you're the one that's left behind, that could be an existential risk to your business. So they're all trying to get as much done, as much compute power out there, as much large language model training as they can. And power is a constraint to that. So there's very much a land rush mentality happening right now. And the power sector is at the kind of back end of that bearing the consequences. And I want to ask you more, Arnie, about those consequences. You already hit on it a little bit, but I wonder if we could go a little bit further. You know, about the real world consequences of under and over forecasting electricity demand. Tell us a little bit more about what's at stake here. 
Well, we already discussed a little bit about what the consequences would be in a more normal situation. If the utility over-forecasts its load, that likely means that it over-invests in grid capacity, and that imposes some additional costs onto its ratepayers. Now, if the utility continues to grow, then load growth can catch up with that overinvestment. And so it's really a matter of you built too early and there are costs associated with that. I want to ask you, when you're talking about overbuilding, we're talking about transmission, we're talking about generation, both? Just want to make sure I'm clear on that. Yeah, no, good question. It's really all of it. And it just depends on the specific site, whether transmission is needed and how much, or distribution, substations, et cetera. But yeah, just think of it as all of the infrastructure needed to generate and deliver power to a specific customer. If you under-invest or you under-forecast, then you might really put grid reliability at risk. So, you know, I have more load than I expected. I have less power than I need. I might be able to go out to the market and buy that power, but I might not be able to. So normally, the utility would look at those two risks and see a large asymmetry. It's much better to build a little bit too early and spend a little bit extra money than it is to put the grid reliability at risk and potentially have rotating blackouts because you didn't have enough power. So typically they would, all else being equal, probably over-forecast or be a little bit conservative in their forecast. With these very large new loads now coming online, the amount of money that's at stake for the investment component is so much larger, and the risks are so much larger, that that kind of turns that on its head a little bit. And because those risks are so large, that's why some additional tools and risk management become really important.
And that's why E3 was interested in writing the white paper, to sort of deal with this new form of uncertainty and to think of it from a risk management perspective. Well, that's great for leading into that, because Shana, I want to go into that issue with you right now. So again, as Arnie just said, this has been really kind of framed as a forecasting problem: get the forecast right, because you need to know how much infrastructure to build to support these new data centers, these new large loads. But again, there is no crystal ball; we don't know the future. So the report is about reframing this from a "get the forecast nailed" problem to a "manage the risk" problem. Tell us a little bit more about this shift in framing. Yeah. So, I mean, as we all know, a forecast is never going to be perfect. It's never going to be, quote unquote, right. But that's amplified in an environment with rapid technological change and speculative investment. So the mistake is treating the forecast as a prediction rather than just an input to making your decisions, whether that's resource procurement or investment in upgrading your transmission system. And what we think utilities really need to manage is the exposure. So how much risk are they taking on if load grows faster than expected? How much if it falls short? And framing it as risk management shifts that focus towards flexibility and financial safeguards. And it is more of a looking at a range of possibilities and not just what historically has been the most likely scenario. Because that's just not where we're at right now in the energy industry. One of the issues is, so we're talking about looking at the future as a risk management challenge. But just going for a moment to something that Arnie said a little bit earlier, you noted the magnitude of infrastructure that is in play here when we talk about building out the grid to meet these future demands. Hopefully they will, you know, arrive, materialize, if we've built a plan for them.
But a related issue here is that the electricity industry itself is not really built to operate at the level of speed and adaptability and nimbleness that is currently being called for. I wonder if you could just kind of acknowledge or talk about this fundamental tension, because it will impact the ability of data centers to connect to the grid, and it will ultimately influence how much new demand will be coming on and when. I think you need to start with the fact that electricity infrastructure is enormously capital intensive. And it just requires, historically, a long time to engineer, to plan, to permit, to construct, to commission, and to energize. So just to give you an example, a new high-voltage transmission line can often take 10 years or even longer from the time that you first start to plan that line to when you bring it online. There's a famous power line in the Northwest, where I'm from, that connects Idaho with Oregon that is now finally well under construction. It's been 19 years for that project. That's just one example. Power plants will typically take four to six years. Transformers and a lot of other things are in short supply. So, you know, those are the types of timeframes that we're dealing with here. And there's typically a lot of public process around both, you know, how much the costs should be and who should bear them, but also the environmental impacts of the infrastructure that's being built. The land use impacts from transmission lines, long linear facilities. There's emissions impacts from power plants and land use impacts as well. So the public has historically wanted to have a say in the nature, size, extent of these types of facilities. If you contrast that with the world that a lot of the hyperscalers are used to dealing with, Moore's Law is only focused on 18 months. It says you're going to double your computing power every 18 months. So they're used to building their stuff much faster.
Constructing chips, racking them up into servers, into larger data centers, is just not nearly as capital intensive and as extensive of an exercise. So it's a bit of a clash of mindsets in a way between these new large consumers of electricity just having a time frame that doesn't match with the way that the energy sector has traditionally done business. And it's not clear that we can really speed things up that much more rapidly with respect to the electricity sector, because the grid impacts of these new facilities are very, very extensive, and they need extensive study to ensure that other people won't be harmed by these new loads coming online. Yeah, I think of it as like a cultural mismatch. And as Arnie said, right, like data center developers operate in a highly competitive, fast moving environment and utilities operate in almost the exact opposite and a very regulated, risk averse one. And for important reasons, both of those are the way that those industries work. But coming together, it's very hard. There's not ever going to be a match, I think, with the speed that the data centers want with what the utilities can actually do. As you start looking at the future again as uncertainty to be managed, you at E3 have built your own models for what future demand will look like. And in doing so, you're not throwing forecasting out the window completely, but instead you're presenting a range of scenarios of possible futures of electricity demand. And scenarios are used a lot in forecasting. The International Energy Agency, for example, uses scenarios when it looks at future global energy demand. In your view, how and why should utilities and grid operators be using scenarios in their efforts to understand future ranges of demand and the uncertainty around that? Shana?
Scenarios explicitly acknowledge the uncertainty that's happening in the industry right now, whereas a single point-in-time forecast creates an idea of false precision. It implies a level of certainty that just doesn't exist with the amount of new load being planned for at this time. And I use the hurricane cone-of-uncertainty analogy: you're not going to evacuate based only on the projected landfall in one certain city. The cities or towns around that are still going to prepare and maybe evacuate, because the hurricane can change track while it's going. So that's the way that we're trying to think of forecasting, not as a single point in time. You need to keep testing it. You need to make sure that your assumptions are correct and that they're as accurate as they can be on an ongoing basis. So despite all this, we're still seeing grid operators, I'm thinking about PJM in the most recent case, still focusing on a single point forecast. So the question here might be, does this scenario type of forecasting work within the context of a competitive wholesale electricity market operator, again, like PJM, which is a regional transmission organization? Arnie, I want to ask you, why hasn't such a market published a range of possible futures when it releases its forecast? So they do publish ranges and they do use scenarios for long-term transmission planning. So, you know, MISO had, I forget, five different futures that they published when they did their longer-term transmission outlook, their 20-year transmission outlook. And, you know, we always focus on MISO Future 2A as an example of one that's policy compliant, and one that a lot of our clients find useful for planning. So, for long-term planning, they do. And the range of uncertainty, of course, is larger the farther that you go out in time.
The issue is when they get into their capacity market. So this is where they're procuring the capacity that the system needs to meet its resource adequacy standard, you know, a year to three years in advance, depending on the market and what setup they have. In those cases, the market operator is running an auction and billions of dollars are being cleared through the auction in capacity payments. So in that context, they really need to have a number that they procure to. In fact, you know, they don't have a single number. They have a sloped demand curve. So I guess you could say maybe they're embedding scenarios a little bit in their sloped demand curve. But in effect, you know, they're buying capacity for a fixed expected amount of load, and they'll buy a little bit more if the price is cheaper and a little bit less if the price is more expensive. But it's because they need a number to clear the auction with. That isn't to say that that's the only generation planning anybody should do. The RTO is really almost the backstop, or like the spot market, for capacity. Any individual end user should be doing its own scenario planning and doing its own hedging against that spot capacity price when it's thinking of its own long-term interest. The RTO-run auction is just a price signal that the market should use to do its own scenario planning. But the definition of a market is that there is no central planning. Generation is a competitive market. No one is centrally planning that generation portfolio. All of the market participants are doing their own individual scenario planning. In the solutions to all this, in managing this uncertainty, you point out that demand response can play a very important role, again, for managing the uncertainty of the grid. And you say, quote, that demand response is a critical hedge against forecast error. I wonder if you could explain a little bit more the role of DR, demand response, in addressing this forecast uncertainty.
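The sloped demand curve Arnie describes can be made concrete with a toy auction clearing. The curve shape (full price below 90% of the target, zero above 110%) and all offer numbers below are invented for the sketch; they are not PJM's actual demand curve parameters.

```python
def demand_price(mw, target=5000.0, cap_price=300.0):
    """$/MW-day the demand curve pays for the mw-th megawatt:
    full cap price below 90% of target, falling linearly to zero at 110%."""
    lo, hi = 0.9 * target, 1.1 * target
    if mw <= lo:
        return cap_price
    if mw >= hi:
        return 0.0
    return cap_price * (hi - mw) / (hi - lo)

def supply_price(mw, offers):
    """Offer price of the marginal MW on the stacked supply curve."""
    cum = 0.0
    for block_mw, price in offers:
        cum += block_mw
        if mw <= cum:
            return price
    return float("inf")  # no more supply offered

# Supply offers, cheapest first: (MW, $/MW-day).
offers = [(2000, 50.0), (1500, 120.0), (1000, 180.0), (1500, 260.0)]

# Walk up the quantity axis until supply costs more than demand will pay.
step, cleared = 10.0, 0.0
while supply_price(cleared + step, offers) <= demand_price(cleared + step):
    cleared += step
clearing_price = demand_price(cleared)
print(cleared, clearing_price)
```

The sloped curve is what lets the market "buy a little bit more if the price is cheaper": here the auction clears past the 4,500 MW knee of the curve because the last supply block is offered below what the curve still pays there.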
I think the way we're thinking of it, treating it as a hedge, is that demand response can become a planning asset. It provides optionality if load grows faster than expected, and it reduces stranded-asset risk if it does not. I would caution two things, though. Not all data centers are equally flexible, and assuming they are can be a big risk to your forecast. And then flexibility doesn't mean that they're shutting down their operations. They may be shifting workload to other data centers. And most likely, they're going to be relying on behind-the-meter generation. And that, at this point, is most likely fossil fuel-based. So some of those data centers are more flexible inherently, right? I think a data center that's focused on AI training may be able to reschedule its processes, where a data center that actually has to respond to queries in the moment, in real time, may not be. Is that right? That's exactly right. And I would even say that within the AI data centers, what we've seen through a few studies and speaking with a bunch of data center developers is that they have less flexibility than the hype around curtailment suggests, because they do have contracts with their customers that they have to fulfill. And being down is very expensive for them. And you mentioned curtailed. Curtailed is when their supply would actually be reduced. The grid operator would make that decision essentially for them. Correct. I think we're seeing load flexibility, demand response, onsite generation, really all the things that you can do on the site of a data center, as really being an important tool, especially for speed to market. I talked earlier about how the grid studies, the impact studies that need to be done for large data centers, are extensive and time consuming, and the grid infrastructure can be expensive and time consuming to construct.
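The optionality Shana attributes to demand response can be illustrated with a back-of-envelope expected-cost comparison. All dollar figures, the 40% probability, and the 250 MW increment are invented for the sketch; the point is only the structure of the hedge.

```python
# Without DR, the planner builds for the high-load scenario up front.
# With DR as a hedge, it contracts DR now (at a premium) and builds
# the firm capacity later, only if the large load actually shows up.

P_HIGH = 0.4                 # assumed probability the large load materializes
BUILD_COST_PER_MW = 1.2e6    # assumed $/MW of new firm capacity
DR_COST_PER_MW = 0.3e6       # assumed $/MW-equivalent of contracted DR
EXTRA_MW = 250.0             # the uncertain load increment

# Build-ahead: the capacity cost is sunk in every scenario.
build_ahead = EXTRA_MW * BUILD_COST_PER_MW
# DR hedge: pay the DR premium for sure, build only in the high scenario.
dr_hedge = EXTRA_MW * DR_COST_PER_MW + P_HIGH * EXTRA_MW * BUILD_COST_PER_MW
print(build_ahead, dr_hedge)
```

Under these assumed numbers the hedge's expected cost ($195M) beats building ahead ($300M); the comparison flips as the load becomes near-certain, which is why DR is a hedge against forecast error rather than a substitute for building.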
If you can put in place a data center that has resources on site, and it's a combination, again, of actual compute flexibility, maybe storage, maybe on-site generation, and you can agree to operate the data center in such a way that you're not having any impacts on the rest of the grid, that may not be the most cost-optimal way for that data center to come online. It might be better if it were fully grid-integrated, but it's a way for them to get online quicker while all the studies are being done about how to fully grid-integrate that data center. So it's a way to get online faster, and then over time it will more and more fully participate in the grid and the wholesale market as all the studies are done to ensure that it can be done reliably. Well, I guess the next question here would be, how do you incentivize these data centers and other large loads to actually be flexible? So, to make it worth their while, financially or otherwise, to be flexible. And you highlight the role of rate design in incentivizing flexibility. So for those who may not be familiar, could you tell us what rate design is and how it might be used to incentivize large electricity consumers to be flexible in their consumption of power, as you've just described? So rate design is basically the way that the utility recovers its costs to serve its customers. For example, traditionally, transmission and generation are recovered in demand charges, versus energy recovered in a dollar-per-kWh charge. And there are all different types of rate designs you can do by time-of-use period. So in high-cost periods the rates are higher than in the low-cost periods, so you're incentivizing people to cut their usage in the high-cost periods. And we just think of it as one of the few levers that utilities have that can translate the system's needs into customer behaviors. But there is the caveat that these customers are not necessarily very price-sensitive.
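The rate-design mechanics described above, a demand charge on peak draw plus time-of-use energy prices, reduce to simple arithmetic. This is a hypothetical sketch with made-up rates and a made-up peak window, not any utility's actual tariff; it shows why shifting load out of the high-cost period lowers a bill even when total consumption barely changes.

```python
PEAK_HOURS = range(14, 20)                 # 2pm-8pm, hypothetical window
RATE = {"peak": 0.18, "off_peak": 0.07}    # $/kWh, illustrative only
DEMAND_CHARGE = 15.0                       # $/kW of the month's max demand

def monthly_bill(hourly_kw):
    """hourly_kw: average kW draw for each hour of a 720-hour month."""
    energy = sum(
        kw * (RATE["peak"] if (h % 24) in PEAK_HOURS else RATE["off_peak"])
        for h, kw in enumerate(hourly_kw)
    )
    demand = DEMAND_CHARGE * max(hourly_kw)  # demand charge on the peak hour
    return round(energy + demand, 2)

flat = [100.0] * 720                         # runs flat all month
shifted = [80.0 if (h % 24) in PEAK_HOURS else 105.0 for h in range(720)]
print(monthly_bill(flat), monthly_bill(shifted))
```

The shifted profile pays less on both components: cheaper energy in the off-peak window, though here its demand charge actually rises slightly with the higher off-peak draw. As the speakers note, for data centers in the current gold rush, this price lever moves behavior less than speed of interconnection does.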
Meaning they're going to run whatever the price of electricity is, right? Right, right. So I think it's more that non-price incentives matter more. What we've seen is faster interconnection and non-firm service options, and those can be more motivating to a data center than price signals. But we're in the gold-rush time period right now. If this eventually slows down, they most likely will become a little bit more price-sensitive. So on the speed-to-connect side, basically what you're saying is part of the incentive here is, we'll get you hooked up, but then you're going to have to be flexible in return. Is that correct? If there's not enough electric power at any given time, you're going to have to lower your consumption, but at least you're going to be able to connect to the grid. Is that right? That's what we're seeing in several markets. I think the ERCOT Texas market is really pushing forward on that. Is that correct? Yes, it is in ERCOT. In the report that we've been talking about, you also emphasize the importance of what you call financial symmetry and its importance in reducing risk and uncertainty around the magnitude of future demand. What is that financial symmetry that you're talking about, and why is it so important in this context? We think of financial symmetry as aligning the financial commitments of the data center to the amount of risk to the utility and the ratepayers. So a good example of this is, say there are 10 data centers in the interconnection queue, and two of them are in the very earliest phases of just exploring whether they can be interconnected. At that time, the risk to the utility is very low from a financial-commitment standpoint. There still is some risk if you're taking that as an input to your forecasting, but the utility hasn't put out a bunch of capital to upgrade transmission lines or build new generation. And so the financial commitment from the data center should reflect that low risk.
However, when we're getting to the point where transmission upgrades are occurring or energy service agreements are being executed, the risk to the utility and other ratepayers of stranded assets becomes much greater. And that is where you see things like contribution-in-aid-of-construction payments being made. You see direct pass-through of transmission upgrade costs, and collateral requirements that ramp up with the risk to the utility and other ratepayers. You talked about rate design before. Rate design historically has been a method of ensuring equitable apportionment and recovery of costs from the variety of different customers that a utility might have. These data centers are large enough that we're now having to think about equitable allocation of risk. It's not something that the regulated utility and its regulators have had to think about before, but the risks are so large now that we're having to apply some of those same techniques. And, in effect, it requires some security from these new large loads: if the utility is going to make billions of dollars of investments assuming that they're going to come, it needs to have some financial security to protect its other ratepayers against those risks. Basically, getting to the point here that they have to abide by their commitments, essentially, is what we're talking about. Yeah, and again, that's not something that's been done historically, but that's because we've been dealing with a million customers that are a few megawatts each, and not a few customers that are hundreds of megawatts each. If we're looking ahead at the future, how should our listeners be thinking about the next few years of large load growth? What is success going to look like in managing this uncertainty, and managing it well, even if the forecasts keep changing? Shana, let's start with you on that one.
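The "financial symmetry" idea, collateral that ramps with the utility's sunk-cost exposure, can be sketched as a milestone schedule. The stages loosely track the ones mentioned in the conversation, but every fraction and dollar figure here is hypothetical, not from the report or any tariff: the only claim is the shape, near-zero security at the queue-request stage and full cover once ratepayer capital is committed.

```python
MILESTONES = [
    # (stage, utility capital at risk as a fraction of project cost,
    #  collateral required as a fraction of that exposure) -- all made up
    ("interconnection_request",  0.00, 0.00),
    ("system_impact_study",      0.01, 0.50),
    ("transmission_upgrades",    0.40, 0.75),
    ("energy_service_agreement", 1.00, 1.00),
]

def required_collateral(stage, project_cost):
    """Collateral owed by the large load at a given milestone."""
    for name, exposure, cover in MILESTONES:
        if name == stage:
            return project_cost * exposure * cover
    raise ValueError(f"unknown stage: {stage}")

for stage, _, _ in MILESTONES:
    print(stage, required_collateral(stage, 500e6))  # a $500M project
```

Under a schedule like this, a data center that withdraws after upgrades have begun forfeits security roughly proportional to the stranded cost it leaves behind, which is the symmetry the speakers are after.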
I expect the current pace of the projected growth will eventually slow, not because the demand disappears necessarily, but because of physical constraints, like not being able to get generators and all of that. The backlog is getting very big. But I think the broader takeaway is that alignment between load growth, resource development, and system capability has to be intentional and thought through. And that's why we're having this exact conversation now, at this time. It's going to be messy for the next five years. And this is going to play out in a thousand different individual cases across the country, across all 50 states. Data centers are going to want to come online. They're going to put in a projection. They're going to put in a request. The utility is going to study it. There will be cases where the utility spends some money and the data center load goes away. It's unavoidable. I think we want to make sure that the financial consequences fall correctly on the data center if they're the ones that pull out; that's a bit of the point of the paper. But there are going to be cases like that. Hopefully they'll be few and far between. Hopefully the ratepayers will be protected from those impacts. And hopefully the investments will be right-sized, and a lot of this new load will come online and will be served with cost-effective new resources in a timely manner. That's really what success is. These are exciting new developments from a technology perspective, potentially saving huge amounts of labor, potentially saving huge amounts of carbon if we can be more productive with the capital and labor that we have. And I don't know how everyone feels about the national security implications of this. I'm not an expert on that, but there certainly seems to be a lot of discussion, a lot of concern about that. So I think we do need to serve these new loads.
We want to do so in a way that's reliable and that protects other customers. And we'd like to do so in a way that's sustainable as well. There probably is going to be a wave of investment in fossil fuel plants to ensure that these loads can be served reliably. I don't think that necessarily means you're locking in a lot of natural gas consumption over time. If you can follow up those reliability investments with more and more investments in clean resources like wind and solar, maybe even nuclear, then you can use those natural gas plants less and less over time, and it doesn't need to be a long-term negative for the climate. Shana and Arne, thanks for talking. Thank you. Yeah, thanks again for having us. Today's guests have been Shana Ramirez, Director of Asset Valuation and Markets at E3, and Arne Olson, a senior partner with E3. For more energy policy research and insights, as well as our archive of nine years of Energy Policy Now, visit the Kleinman Center for Energy Policy website. Our web address is kleinmanenergy.upenn.edu. Thanks for listening to Energy Policy Now, and have a great day.