What Surging AI Demand Means for Electricity Markets


For years, utilities in the US haven't seen much growth in electricity demand. The economy is generally mature and has been able to grow without needing much more electrical power. But all that's changing now, and a big contributing factor is the boom in datacenter demand. It's particularly acute for AI datacenters, which need more power than traditional datacenters and have been growing like crazy ever since ChatGPT brought generative AI into everyone's collective consciousness. So how will utilities handle the sudden surge in load growth? On this episode, we speak with Brian Janous, co-founder and chief strategy officer at Cloverleaf Infrastructure. Brian spent 12 years at Microsoft, where he was the company's first-ever energy-focused hire, so he has seen the rise of datacenter electricity consumption firsthand, and how AI is kicking it up even further. He now works alongside utilities to figure out how they'll meet this growing demand. We talk about why more gas plants are likely to be built, how datacenters and utilities can get more energy out of existing infrastructure, the politics of AI datacenters, and what this all means for the net-zero commitments of major tech companies. This transcript has been lightly edited for clarity.

Key insights from the pod:
When Microsoft first started getting interested in energy — 4:53
How AI caught the tech industry off guard — 9:58
What AI is doing to electricity demand forecasts — 13:43
Can tech companies maintain their net zero commitments? — 22:35
The politics of datacenter electricity demand — 30:29
Do utilities have another route besides more fossil fuels? — 37:02
Are small modular nuclear reactors the answer? — 40:14
Is regulation impairing an effective buildout of the grid? — 41:50

---

Joe Weisenthal (00:17):
Hello and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Tracy Alloway (00:22):
And I'm Tracy Alloway.

Joe (00:23):
So Tracy, a thing that keeps coming up is that all these AI data centers are going to use a lot of electricity. I keep hearing that.

Tracy (00:31):
Yes. Also, I just realized every time you use ChatGPT to write a satirical song, you're diverting energy away from someone turning on a light bulb or something like that.

Joe (00:43):
Potentially something like that.

Tracy (00:44):
It's a zero sum game.

Joe (00:45):
Yeah, so be careful about your random ChatGPT queries. Although I think the training is the more [arduous thing], so maybe your song lyric is okay. I don't think it's that bad.

Tracy (00:58):
In the grand scheme of things, probably not. But there is this overarching conversation about AI's energy use. What exactly is it? That's a big question I have. How do you disaggregate AI servers from your run-of-the-mill software servers? How much is it going to consume, and how is that capacity going to get allocated and built out?

And I think there is this sense that we could end up going in very different, very extreme directions here. So you could have this great situation where, because AI is a desirable activity, because it's profitable in many respects, big tech ends up accelerating the energy capacity build out. Maybe they even start building more green technology capabilities, in an ideal world.

But then you have the polar opposite scenario where you need all this power to develop this technology. There isn't enough and it's sort of a race to the bottom where you have tech companies just trying to get energy wherever they can. Maybe they even start using coal and things like that. So it feels like there's two very different paths that we could be going down here.

Joe (02:08):
Yeah, there's a lot here for us. I remember Jigar Shah, when we interviewed him at the Texas Tribune conference over a year ago, brought this up. It's getting more and more attention. It keeps coming up on the sidelines of episodes.

Steve Eisman obviously talked about it recently, but I feel like it's time to make it a central part of the conversation and actually learn about the numbers: where this power is generated from and, yeah, how much are we really talking about here? We know the tech companies are highly aware of it. There was a headline recently about Microsoft maybe wanting to do something with onsite nuclear development.

Tracy (02:44):
No, they did do something so I think they bought…

Joe (02:47)
Well, they put out a headline.

Tracy (02:49)
Well, didn’t they buy a data center next to a nuclear power plant, a Susquehanna thing? I thought they did.

Joe (02:53):
Yeah, I think you're right about that. But then the other element too is, you mentioned that one solution here is just fossil fuels and dirty energy, except that all these tech companies are very progressive-minded and they all have these net zero commitments about how they're going to get it all from windmills or, sorry, wind turbines and solar and batteries and stuff.

Tracy (03:13):
Windmills next to AI servers would be an interesting one.

Joe (03:17):
But, you know, at some point the rubber's got to hit the road with how realistic are their net zero commitments or how can they achieve them if they're engaging in this investment activity that is highly energy intensive.

Tracy (03:27):
No, absolutely. And you are seeing a lot of this discussion reflected in the conversation around AI investment at this point. So I think a lot of people feel like they missed that first wave of chips around Nvidia. So everyone's looking for the sort of second order investment play and a lot of people now are talking about energy or cooling and HVAC. So we need to talk about it.

Joe (03:50):
Well, I am really excited because I do believe we have the perfect guest for this topic. Someone who spent 12 years at Microsoft way before it was hot to talk about how tech companies needed all this electricity and energy.

He was the first energy hire at Microsoft and he recently left, last year he left to start his own firm to work on this problem specifically. So we are going to be speaking with Brian Janous, he's the co-founder and chief strategy officer at Cloverleaf Infrastructure, a power development company that works closely with utilities on solving this problem. So Brian, thank you so much for coming on Odd Lots.

Brian Janous (04:29):
Thank you for having me. Really excited to be here.

Joe (04:31):
So you got hired from Microsoft 12 years ago to do energy, and at the time I don't think anyone was talking about energy as being a particularly important aspect of these software companies or these big tech companies’ strategies. What was going on back then, or what did they see when they felt like ‘Hey, we need to hire a VP of energy here?’

Brian (04:53):
Yeah, it was actually funny because I had spent my career prior to that working with mainly large energy consumers who you'd expect it to be the big industrial companies. And so when Microsoft came calling and said ‘Hey, we need to get a full-time energy person,’ I told them it sounded like a dead end job to be the energy person at a tech company because why would they ever actually care about this issue?

And the person that was recruiting me said ‘Hey, I think there is something to this whole cloud thing. And I think energy is going to start to be pretty central to what we're doing as a company.’ And fast forward a decade, and I remember having a conversation before I left the company, I was talking to the head of corporate strategy and he said to me, he's like ‘I don't think people quite realize the degree to which Microsoft is really just an energy company. We need power and we need silicon, we need chips. That's it. That's the business. If we don't have one of those two things, we're in a lot of trouble.’

And so it was remarkable to see the shift over that decade-plus, from maybe one or two people at the company starting to think that energy might be important to us one day, to energy actually being absolutely central to everything the business does.

Tracy (06:06):
So talk to us a little bit more about that cultural shift, because Joe and I heard from someone recently that a bunch of the big tech companies that you would be very familiar with had representatives down in Houston for CERAWeek -- the annual energy conference -- which they were describing as kind of a new development. But how familiar are tech companies nowadays with energy usage and needs, and how much expertise have they actually built out in that capacity?

Brian (06:38):
Yeah, it’s been a tremendous shift. And I mean, if you had gone to CERAWeek even five years ago, you would not have seen a whole lot of engagement from the tech industry. But as their businesses have shifted to the cloud, and as the business opportunities that sit in front of them have arisen, particularly when it comes to AI, there's been a recognition that power really is central to what they're doing.

And it was a slow shift. If you go back to the advent of the first cloud data centers, it really was about being close to the network. And so that was the driver of, strategically, where do you put data centers? Well, you put them where the biggest network hubs are. So that's why we have lots of data centers in Northern Virginia. That's why we have lots of data centers in Amsterdam. Everyone was chasing network.

Probably around the middle of last decade, there was a shift, and it started to go, 'Actually, we want to be close to eyeballs.' So this started sort of a land grab, with all the cloud providers starting to build lots of new data centers in new countries. They wanted to be close to where the customers were.

And so from about fall of 2019 through probably spring of 2022, I think Microsoft was adding close to a region a month in terms of new data center regions it was establishing around the world. And then in mid-2022, that's when the realization started to sink in that, wait a minute, this whole game is about power. Because that's when we were first starting to hear rumblings of what OpenAI was working on and the scale of what ChatGPT was going to be, which was sort of the first big release where everyone was like, 'Wait a minute, this is kind of a big deal, what AI is doing and how fast it's moving.'

And then when we had that release of ChatGPT in the fall of '22, and then shortly thereafter GPT-4 was released, there was a massive increase in capability in that release, if you recall, in terms of what it could do on getting scores on various tests and things. And it was that moment that I realized this technology is moving way faster than the utility industry is moving. If we can make this much improvement in this technology in a six-month time horizon, we are in a lot of trouble, because the power industry does not move that fast.

Joe (09:12):
So I'm really fascinated by this. You were at Microsoft, so you had a front row seat to what OpenAI was doing with GPT-1 and GPT-2, and there were a lot of people aware of this. And I'm sort of fascinated by this idea that it was the commercialization, sort of making it easy and public when it became ChatGPT, that made people go 'Oh, this is serious.' And then we saw everyone rushing to buy Nvidia chips and all these VCs pivoting to AI, etc. So talk to us about the math there. It feels like there has been this level shift up in expectations of data center demand growth, basically as a function of all of the excitement for AI.

Brian (09:58):
Yeah, and you're right, I mean it's not like we didn't know that Microsoft had a partnership with OpenAI and that AI was going to consume energy. I think everyone, though, was a bit surprised at just how quickly what ChatGPT could do captured the collective consciousness.

You probably remember when that was released. I mean, it really sort of surprised everyone, and it became this thing where suddenly, even though we sort of knew what we were working on, it wasn't until you put it out into the world that you realized what you'd created. That's where we realized we are running up this curve of capability a lot faster than we thought -- the number of applications that are getting built on this, the number of different ways that it's being used, and how it's just become sort of common parlance. I mean, everyone knows what ChatGPT is, and no one knew what it was the month before that.

So there was a bit of a surprise, I think, in terms of just how quickly it was going to capture the collective consciousness, and then obviously lead to everything that's being created as a result. And so we just moved up that curve so quickly, and I think that's where the industry got behind -- certainly the utilities were behind, because as you may have seen, a lot of them are starting to restate their load-growth expectations.

And that was something that was not happening right before that. And so we've had massive changes just in the last two years in how utilities forecast load. So if you take a look at a utility like Dominion in Virginia -- that's the largest concentration of data centers in the United States, so they're a pretty good representative of what's happening -- if you go back to 2021, they were forecasting load growth over a period of 15 years of just a few percent.

I mean, it was single-digit growth over that entire period. So not yearly growth, but over 15 years, single-digit growth. By 2023, they were forecasting 2X growth over 15 years. Now keep in mind this is an electric utility. They do 10-year planning cycles. So because they have very long lead times for equipment and for getting rights of way for transmission lines, they aren't companies that easily respond to a change to 2X growth over a period of 15 years.

I mean, that is a massive change for an electric utility, particularly given the fact that the growth rate over the last 15 to 20 years has been close to zero. So there's been relatively no load growth in 15 to 20 years. Now suddenly you have utilities having to pivot to doubling the size of their system in that same horizon.
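To put that forecast shift in perspective, doubling over 15 years works out to a compound annual growth rate of roughly 4.7% per year, against the near-zero annual growth utilities had been planning around. A quick sketch of the arithmetic (illustrative figures only, not Dominion's actual filings):

```python
# Implied compound annual growth rate (CAGR) for a load forecast
# that doubles over 15 years. Illustrative arithmetic only.
def implied_cagr(growth_multiple: float, years: int) -> float:
    """Annual rate r such that (1 + r) ** years == growth_multiple."""
    return growth_multiple ** (1 / years) - 1

rate = implied_cagr(2.0, 15)
print(f"2X over 15 years implies about {rate:.1%} growth per year")  # ~4.7%
```

Even a modest-sounding annual rate compounds into a very large absolute build-out for a system that has been flat for two decades.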

Tracy (13:10):
I want to ask a very basic question, but I think it will probably inform the rest of this conversation. When we say that AI consumes a lot of energy, where is that consumption actually coming from? And Joe touched on this in the intro, but is it the sheer scale of users on these platforms? Is it, I imagine, the training that you need in order to develop these models? And then, does that energy usage differ in any way from more traditional technologies?

Brian (13:43):
Yeah, so whenever I think about the consumption of electricity for AI or really any other application, I think you have to start at sort of the core of what we're talking about, which is really the human capacity for data, like whether it's AI or cloud, humans have a massive capacity to consume data.

And if you think about where we are in this curve, I mean we're on some form of S-curve of human data consumption, which then directly ties to data centers, devices, energy consumption ultimately, because what we're doing is we're turning energy into data. We take electrons, we convert them to light, we move them around to your TV screens and your phones and your laptops, etc. So that's the uber trend that we're riding up right now. And so we're climbing this S-curve. I don't know that anyone has a good sense of how steep or how long this curve will go.

If you go back to look at something like electricity, it was roughly a hundred-year S-curve. It started at the beginning of the last century, and it really started to flatline, as I mentioned before, towards the beginning of this century. Now we have this new trajectory that we're entering, this new S-curve, that's going to change that narrative. But that S-curve for electricity took about a hundred years.

No one knows where we are on that data curve today. So when you inject something like AI, you create a whole new opportunity for humans to consume data, to do new things with data that we couldn't do before. And so you accelerate us up this curve. So we were sitting somewhere along this curve, AI comes along and now we're just moving up even further. And of course that means more energy consumption because the energy intensity of running an AI query versus a traditional search is much higher.

Now, what you can do with AI obviously is also much greater than what you can do with a traditional search. So there is a positive return on that invested energy. Oftentimes when this conversation comes up, there's a lot of consternation and panic over ‘Well, what are we going to do? We're going to run out of energy.’

The nice thing about electricity is we can always make more. We're never going to run out of electricity. Not to say that there's not times where the grid is under constraint and you have risks of brownouts and blackouts. That's the reality. But we can invest more in transmission lines, we can invest more in power plants and we can create enough electricity to match that demand.

Joe (16:26):
Just to sort of clarify a point and adding on to Tracy's question, you mentioned that doing an AI query is more energy intensive than, say, if I had just done a Google search or if I had done a Bing search or something like that. What is it about the process of delivering these capabilities that makes it more computationally intensive or energy intensive than the previous generation of data usage or data querying online?

Brian (16:57):
There's two aspects to it, and I think we sort of alluded to it earlier, but the first is the training. So the first is the building of the large language model. That itself is very energy intensive. These are extraordinarily large machines, collections of machines that use very dense chips to create these language models that ultimately then get queried when you do an inference.

So then you go to ChatGPT and you ask it to give you a menu for a dinner party you want to have this weekend, and it's then referencing that large language model and creating this response. And of course that process is more computationally intensive, because it's doing a lot more things than a traditional search does. A traditional search just matches the words you put in against a database of knowledge it has put together, but these large language models are much more complex, and therefore the things you're asking them to do are more complex.

So it will almost by definition be a more energy intensive process. Now, that's not to say that it can't get more efficient and it will, and Nvidia just last week was releasing some data on some of its next generation chips that are going to be significantly more efficient than the prior generation.

But one of the things that we need to be careful of is thinking that because something becomes more efficient, we're therefore going to use less of the input resource, in this case electricity. That's not how it works, because, going back to the concept of human capacity for consuming data, all we do is find more things to compute. You've probably heard of Jevons paradox. Jevons was an economist in the 1800s, and the idea was, 'Well, if we make more efficient steam engines, then we'll use less coal.'

And he's like, 'No, that's not what's going to happen. We're going to use more coal, because we're going to mechanize more things.' And that's exactly what we do with data. We've had Moore's Law for years, so chips have become incredibly more efficient than they were decades ago, but we didn't use less energy. We used much more energy, because we could put chips in everything.

So that's the trend line that we're on. We're still climbing that curve of consumption. And so no amount of efficiency is going to take us off of continuing to consume more electricity, at least in the near term, because I don't believe we're anywhere close to the bend in that S-curve.
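The dynamic Brian describes can be sketched with a couple of made-up numbers: even if per-query efficiency doubles, total consumption still rises whenever usage grows faster than efficiency improves. All figures below are hypothetical, purely to illustrate the Jevons effect:

```python
# Toy illustration of Jevons paradox applied to AI queries.
# Every number here is hypothetical.
energy_per_query_wh = 3.0        # Wh per query before the efficiency gain
queries_per_day = 100_000_000    # daily query volume before the gain

# Chips get 2x more efficient, but cheaper compute unlocks 4x the usage.
energy_after = energy_per_query_wh / 2
queries_after = queries_per_day * 4

total_before = energy_per_query_wh * queries_per_day  # Wh/day before
total_after = energy_after * queries_after            # Wh/day after
print(f"Daily energy: {total_before:.1e} Wh before, {total_after:.1e} Wh after")
```

With these invented numbers, a 2x efficiency gain paired with 4x usage growth leaves total consumption twice as high as before.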

Tracy (19:36):
So I have another basic building-block kind of question. When we say that technology companies are aware of the importance of energy usage or availability, and that this is something they have been working on, what exactly is the process by which a tech company gets its energy? So you have a big data center. I imagine you have some sort of agreement with whatever utility is in that area, but I also imagine that agreement looks very, very different from my household energy bill or something like that?

Brian (20:14):
I'm certain it does.

Tracy (20:16):
Hopefully!

Brian (20:17):
Significant orders of magnitude, yes. So there's two components. I mean, one is, if you're building a data center, you have to plug it in somewhere. You've got to plug it into the grid. So there you're working with your local electric utility or transmission company and doing planning for how big this facility is going to be and how much power it's going to pull off the grid at any given time.

And then you plan over a period of time, because these facilities just tend to grow forever -- and so that's the nuts and bolts of connecting to the grid. Now, the second piece of course is there needs to be some generation source as well. Where's the power going to come from? Those two things are related, but they can be somewhat disconnected. This is where you see these companies, especially the tech companies who've really been leaders in this space, entering into all these power purchase agreements for wind energy and for solar energy.

And in some cases, nuclear -- you mentioned the project earlier; that's actually an AWS project, where they've sited a data center next to the Susquehanna nuclear plant. So all of that is about where the electrons are going to come from, and how, with the purchasing power of being some of the largest energy consumers on the planet, these companies can begin to influence the mix of generation on the grid.

And that's the critical issue is that you're trying to influence where that power is being generated from. And one thing just to keep in mind is that the electrons you get, whether it's at your house or the data center down the street, they're all the same electrons. You're all pulling from the same grid, but what you're trying to do is influence how that generation is being created. And that's where these purchase agreements come in for all these different sources of energy.

Joe (22:07):
Alright, now let's bring the question back to say the utility side or say the Dominion side. So Dominion executives for decades have basically seen no growth and then suddenly in the span of a year they're like ‘Oh, actually we're going to double.’ What do they do? What are they doing right now, today, we're recording this April 10th, 2024. What are they doing right now to expand generation or expand the grid or whatever it is to meet that doubling of demand?

Brian (22:35):
Well, this is where it gets a little concerning. You have these tech companies that have these really ambitious commitments to being carbon neutral, carbon negative, having a hundred percent zero carbon energy a hundred percent of the time, and you have to give them credit for the work they've done.

I mean, that industry has done amazing work over the last decade to build absolutely just gigawatts upon gigawatts of new renewable energy projects in the United States all over the world. They've been some of the biggest drivers in the corporate focus on decarbonization. And so you really have to give that industry credit for all it's done and all the big tech companies have done some amazing work there.

The challenge though that we have is the environment that they did that in was that no growth environment we were talking about. They were all growing, but they were starting from a relatively small denominator 10 or 15 years ago. And so there was a lot of overhang in the utility system at that time because the utilities had overbuilt ahead of that sort of flatlining. So there was excess capacity on the system.

They were growing inside of a system that wasn't itself growing on a net basis. So everything they did, every new wind project you brought on, every new solar project you bought on, those were all incrementally reducing the amount of carbon in the system. It was all net positive.

Now we get into this new world where their growth rates are exceeding what the utilities had ever imagined in terms of the absolute impact on the system. The utilities’ response is ‘The only thing we can do in the time horizon that we have is basically build more gas plants or keep online gas plants or coal plants that we were planning on shuttering.’

And so now the commitments that they have to zero carbon energy, to being carbon negative, etc., are coming into conflict with the response that the utilities are laying out in what are called integrated resource plans, or IRPs.

And we've seen this recently, just last week, in Georgia. We've seen it with Duke in North Carolina and Dominion in Virginia. Every single one of those utilities is saying, 'With all the demand that we're seeing coming into our system, we have to put more fossil fuel resources on the grid. It's the only way that we can manage it in the time horizon we have.' Now, there's a lot of debate about whether that is true, but it is what's happening.

Tracy (25:11):
So when push comes to shove, it seems like some of the green priorities are getting superseded by existential pressures on the business model, perhaps? And we could debate how defensible AI actually is at this point, and how big a moat you have over something like ChatGPT or Claude, or something like that.

But there does seem to be this sense of urgency among tech companies where if you're not building something out right now and trying to dominate the market and really produce the best thing possible, well you're either losing billions of dollars or you're going to be superseded by someone who does manage to do that successfully.

Brian (25:55):
That's exactly right. And it's probably not billions of dollars. It's probably trillions of dollars.

Tracy (25:59):
Yes, yes.

Brian (25:59):
And that's where the competitive pressure is coming in. And this is why there's such a focus right now in this industry on where this power is going to come from. Because the ability to at least envision, and on paper design, training models that are absolutely enormous -- just orders of magnitude bigger than anything that we've ever built in terms of a data center -- is coming into stark contrast with the reality of the power system. One, is that power even available? And two, if it is available, is there a way to do it with a zero carbon approach? Which is, again, what these companies are committed to.

And that's the tension that we're in right now: how do we quickly accelerate the delivery and growth of the electric grid? And I just want to make a quick aside on this -- consuming electricity, in the context we're talking about, is a really great thing.

I mean, this is something that leads to economic growth, it leads to job creation, all of this. This whole problem that we have right now, of electric utilities having to think about this whole new era of growth, is all because we're onshoring manufacturing in the United States and we're building these data centers, which are creating all sorts of amazing tools and creating efficiency across all sorts of sectors. And, in the same vein, we're also electrifying transportation and heating.

All of this is good, it's all goodness. And we didn't even get to things like hydrogen production and other ways that we're going to use electricity. The real rub of this though is that we're in this situation right now where again, the electricity industry was somewhat surprised by this. They weren't prepared for over a period of a couple years, again, going back to the case of Dominion, having to double their load forecast.

So reflexively they're going to go to the one thing they know how to do, which is build gas plants. Because they know that works. That's the easy way out. There are other things we can do, though. There are ways we can leverage the existing system more effectively. We can use things called grid-enhancing technologies, where through sensors and better dynamic rating of power lines, we can actually get more out of the existing system we have.

There's ways we can use storage more effectively because really what we're trying to manage is just these system peaks. Most of the time there's plenty of power. It's really just during the hottest summer hours or the coldest winter hours that the system gets constrained. And that's driving a lot of the need for utilities to want to build this new capacity, but we can manage it in other ways.

And it's really incumbent upon the data center industry to lean in on this and think through how it can be more of a party to solving this problem. Because data centers have lots of opportunities to be more flexible. They have behind-the-meter generation, they have behind-the-meter storage. They can actually be part of the solution, not just part of the problem.

Tracy (29:08):
I just want to press you on this point because I know people will have questions about this. And I take the point about in many respects we're talking about increased energy usage as a result of new things that are leading to, you know, new jobs and new productive industry. And also the idea that well, we can produce more electricity in different ways or we can make the delivery of electricity more efficient and all those types of things.

But I think one of the reservations people might have about this is the idea of competition with large tech companies that have a lot of money and that potentially have a lot of influence over the utility companies. And the idea that maybe you could get a situation where, I don't know, Amazon gets a hundred percent offtake from some power plant in whatever state and maybe other people are left with either not enough electricity or, more likely, much more expensive electricity. Can you talk about that? I was being somewhat facetious in the intro talking about a zero sum game, but there is this idea of competition and there might not be enough to go around, at least at the precise times that everyone might want it.

Brian (30:29):
That's right. And that's the big challenge that grid planners have today: what loads do you say yes to, and what are the long-term implications of that? And we've seen this play out across the rest of the globe where you've had these concentrations of data centers. This is a story that we saw in Dublin, we've seen it in Singapore, we've seen it in Amsterdam.

And these governments start to get really worried of ‘Wait a minute, we have too many data centers as a percentage of overall energy consumption.’ And what inevitably happens is a move towards putting either moratoriums on data center build out or putting very tight restrictions on what they can do and the scale at which they can do it. And so we haven't yet seen that to any material degree in the United States, but I do think that's a real risk and it's a risk that the data center industry faces.

I think somewhat uniquely in that if you're the governor of a state and you have a choice to give power to a say new EV car factory that's going to produce 1,500, 2,000 jobs versus a data center that's going to produce significantly less than that, you're going to give it to the factory. The data centers are actually the ones that are going to face likely the most constraints as governments, utilities, regulators start wrestling with this trade-off of ‘Ooh, we're going to have to say no to somebody.’

And that's the real risk that I think the AI and data center industry faces today is that they are the easiest target because everyone loves what data centers do, but no one particularly loves just having a data center next door to their house. And so that's a real challenge for the industry is that they will start to get in the crosshairs of these regulators, leaders, whomever, who's pulling the strings as these decisions start to get made.

Joe (32:47):
So I just want to share two random thoughts that were in my head. I walked by a film set in the East Village the other day. They were filming this movie, and there were all these big thick electrical cables powering the lights and cameras and all that stuff. And I thought to myself, ‘Oh, it'll be so great when they can just make all the movies with AI, with Sora or something like that, and then we'll also get electricity savings. We won't have to have human actors with actual lights and stuff like that.’

So that'll be exciting. I'm being a little facetious about the end of human actors, but in theory that could be exciting. And then you mentioned that the utilities got surprised by the spike in demand, but it sounds to me like we can't really blame the utilities too much, because if even the people inside Microsoft got caught by surprise by the explosion of AI interest in the fall of 2022, then I guess we can't really blame Dominion. They were probably even further away from the issue.

You mentioned peak demand, and this gets to the type of power, because people talk about this problem with renewables as well. At least when we're talking about solar and wind, there's this intermittency problem. It's not always sunny, even when it's hot; it's not always windy; there's nighttime, etc. How much does that constrain the ability of more renewables to be the solution to the utilities' problem?

Brian (34:14):
It's a real challenge because again, as you noted, we're trying to manage peak demand. That's what all this growth is about. So peak demand is about the certainty that you're going to have power during those highest system peaks, the hottest days, the coldest winter nights.

And you can't always guarantee that renewable generation will be online during those times. And this is the role of the system planner: to look at all these different resources and figure out how we can ensure that we have a sufficient reserve margin, so that we're not going to have things like rolling brownouts or blackouts.

Now, there are a lot of tools, though, that we have to help manage that uncertainty. And we have, month after month it seems, lower-cost battery options, which give us more duration that we can deploy to solve some of these issues. We have the ability of even the loads, through virtual power plants, to be more responsive during these times of system peaks, right?

So we have tools that we can use to manage that uncertainty. The problem is that it's a very complex problem. I mean, you're talking about millions of different data points that you're trying to manage, and the way that utilities have historically managed these things has been fairly rudimentary in terms of sophistication. And so they're having to go through this learning curve of: how do we ensure that we can achieve the load growth that all these industries are expecting and meet the reliability, cost, and availability expectations of our customers?

And that's where the challenge comes in. And this is where the whole problem gets, frankly, really interesting: there are lots of levers that we have, and we don't just have to throw more fossil fuel plants at this problem.

Does that mean we're not going to build any new gas plants in this country? I'm certain we will. I don't think there's a way around this problem, at least in the short run, without having some incremental addition of fossil-based resources. But there's also a lot of other things we could be doing that would significantly reduce dependence on fossil-based resources to achieve the growth objectives that we have as a country.

Tracy (36:36):
What are the levers specifically on the tech company or the data center side? Because again, so much of the focus of this conversation is on what can the utilities do, what can we do in terms of enhancing the grid managing supply more efficiently? But are there novel or interesting things that the data centers themselves can do here in terms of managing their own energy usage?

Brian (37:02):
Yes. There's a few things. I mean, one is data centers have substantial ability to be more flexible in terms of the power that they're taking from the grid at any given time. As I mentioned before, every data center or nearly every data center has some form of backup generation. They have some form of energy storage built into this.

So the way a data center is designed, it's designed like a power plant with an energy storage plant that just happens to be sitting next to a room full of servers. And so when you break it down into those components, you say, okay, well how can we better optimize this power plant to be more of a grid resource? How can we optimize the storage plant to be more of a grid resource? And then in terms of even the servers themselves, how can we optimize the way the software actually operates and is architected to be more of a grid resource?

And that sort of thinking is what's being forced on the industry. Frankly, we've always had this capability. I mean, we did a project around 2016 with a utility where we put in flexible gas generators behind our meter, because the utility was going to have to build a new power plant if we didn't have a way to be more flexible.

So we've always known that we can do this, but the industry has never been pressured to really think innovatively about how to utilize all these assets we have inside the data center plant itself to be more a part of the grid. So I think the most important thing is really thinking about how data centers become more flexible. There's a whole ‘nother line of thinking, which is this idea of, well, utilities aren't going to move fast enough, so data centers just need to build all their own power plants.

And this is where you start hearing about nuclear and SMRs and fusion, which is interesting, except it doesn't solve the problem this decade. It doesn't solve the problem that we're facing right now, because none of that stuff is actually ready for prime time. We don't have an SMR that we can build today predictably, on time, on budget.

So we are dependent on the tools that we have today, which are things like batteries, grid-enhancing technologies, flexible load, and reconductoring transmission lines to get more power over existing rights of way. So there are a number of things we can do with technologies we have today that are going to be very meaningful this decade, and we should keep investing in things that are going to be really meaningful next decade. I'm very bullish on what we can do with new forms of nuclear technology. They're just not relevant in the time horizon of the problem we're talking about [now].

Joe (39:52):
At some point, we're going to do an Odd Lots episode specifically on the promise of small modular reactors and why we still don't have them despite the seeming benefits. But do you have a sort of succinct answer for why this sort of seeming solution of manufacturing them faster, etc., has not translated into anything in production?

Brian (40:14):
Well, quite simply, we just forgot how to do it. We used to be able to build nuclear in this country. We did it in the seventies, we did it in the eighties, but every person that was involved in any one of those projects is either not alive or certainly not still a project manager at a company that would be building nuclear plants, right?

I think we underestimate the human capacity to forget things. Just because we've done something in the past doesn't mean that we can necessarily do it again. We have to relearn these things, and as a country, we do not have a supply chain. We don't have a labor force. We don't have people who manage construction projects that know how to do any of these things.

And so when you look at what South Korea is doing, you look at what China's doing, they're building nuclear plants with regularity. They're doing it at a very attractive cost. They're doing it on a predictable time horizon. But they have actually built all of those resources that we simply don't have in this country, and we need to rebuild that capability. It just doesn't exist today.

Joe (41:19):
One of the things about utilities is that they're weird companies, because they're not like normal businesses. They're sort of natural monopolies. They set prices, my understanding is, based on how much they invest. And so they have to petition local regulators [and] say, ‘Look, we had to invest this much, and that's why we want to raise prices this much, etc.’ Are there regulatory hurdles, or things about the regulatory system right now, that are going to make that doubling of demand more challenging than it needs to be?

Brian (41:50):
Absolutely. And so if you go back to the era we've been in, of relatively no load growth: if you're a utility regulator and a utility comes and asks you for a billion dollars for new investment, you're used to saying ‘no.’ You're used to saying, ‘Well, wait a minute. Why do you need this? What is this for? How is this going to help manage, again, reliability, cost, predictability, etc.?’

Now you're in this whole new world, and going back to this concept of how easily we forget things -- no one who's a regulator today, or the head of a utility today, has ever lived through an environment where we've had this massive expansion of the demand for electricity. So everyone now, including the regulators, is having to relearn: okay, how do we enable utility investment in a growth environment? It's not something they've ever done before. And so they're having to figure out, okay, how do we create the bandwidth for utilities to make these investments?

Because one of the fundamental challenges that utilities have is that they struggle to invest if there's no customer sitting there asking for it; they can't really invest ahead of demand. I mean, if I'm Nvidia and I'm thinking about the world five years from now and think, ‘Wow, how many chips do I want to sell in 2030?’ I can go out and build a new factory. I can go out and invest capital. I don't need to have an order from a Microsoft or an Amazon or a Meta to do that. I can build speculatively.

Utilities can't really do that. They're basically waiting for the customer to come ask for it. But when you have all this demand show up at the same time, well, what happens? The lead times start to extend. And so instead of saying, ‘Yeah, I'll give you that power in a year or two years,’ it's now, ‘Well, I'll give it to you in five to seven years.’ And that's an unsustainable way to run the electric grid. So we do need regulators to adapt and evolve to this new era of growth.

Tracy (44:00):
This is actually exactly something that I wanted to ask you. When we talk about industrial policy, we're used to the importance of an end buyer for whatever capacity we're building out, and utilities have struggled with that to some degree, at least in recent decades. There's this idea that they have huge investment requirements, and while there is clearly demand for electricity, and maybe new types of electricity, it's not always certain, and you're managing these day-to-day cycles and things like that.

But if we know that AI is booming and we know this is a future area of growth, and we see these headlines like AI servers are going to require like a hundred terawatt hours per year and things like that, does that potentially give utilities more certainty or more confidence in the future investment outlook?

Brian (44:58):
I mean, I would say in some respects it does. I've been spending a lot of time with utilities -- well, for most of my career, but even in the last several months -- having this conversation about how they're thinking about this future growth. And they're struggling a little bit, because all they know is what the customers who show up at their door say they want.

They say, ‘Well, okay, I talked to X, Y, Z data center and this is what they say they want,’ but they don't necessarily have a view of the long term. What really is the demand behind that? I'm getting a request because one data center bought one parcel of land and they need 500 megawatts of power. And then they're trying to extrapolate from that: well, what is the underlying demand for data? How much more growth should I expect after that?

And that's where the utilities, I think, are really struggling: they can't see much beyond the requests that they have. And so they're trying to extrapolate, okay, what are these trends? And really, the only way to get a good sense of the real demand for data and the trends is to actually go back, probably, to the Nvidias and the Intels of the world and ask, ‘What's the forecast for chip sales? What's the forecast for how many chips you're going to make?’ Not even sales, but really how much they produce, because frankly, I think every chip they can produce will get plugged in; someone will buy it and it will get plugged in.

So that's probably the best estimate you can come up with for what utility load growth should look like, at least as it relates to data centers. But you have thousands of utilities in the United States, so there's not even a single source you can go to and ask, ‘Okay, what's the forecast next year for electricity load growth?’ Nobody has that. I mean, there are numbers out there, but they're not really based on anything other than speculation. So this is the challenge that utilities have: they don't have a good view into what load growth is really going to look like over the next 5, 7, 10 years.

Joe (47:09):
Brian Janous, fascinating conversation. There's probably like 10 more follow-ups that we could do specifically with you, and maybe one day we'll do them. But in the meantime, thank you so much for coming on Odd Lots. This was great, a conversation that we definitely needed to get done. So really appreciate you joining.

Brian (47:26):
Thank you, Joe and Tracy. Really appreciated it.

Joe (47:41):
Tracy, I thought that was great. I think the first thing that stands out in my mind, working backwards through the conversation, is exactly what you talked about: there's this weird situation where you have this very unpredictable demand -- no one knows what the steady-state demand for this stuff is going to be -- and yet the utilities are legally constrained in the degree to which they can, say, overbuild now or plan for that demand.

Tracy (48:11):
No, absolutely. And also, going back to the beginning of the conversation with Brian, there's the idea of a mismatch between just how fast technology is moving at the moment in terms of developing AI versus utilities and their 10-year investment programs that they need to get regulatory approval for, and all of that stuff.

There was so much to pick out from that conversation. I also thought this was interesting: I think there is a sense among a lot of commentators that there is going to be competition for power, at least at certain times. But I thought Brian's point about how, in some respects, data centers might be the easy target for politicians was really interesting. And again, his example of, well, if you're a governor or something and there's a Tesla factory that wants energy versus a data center that probably has, I don't know, a handful of employees -- maybe that's an exaggeration -- then you're going to go with the Tesla factory, right?

Joe (49:13):
Totally. So you're not going to shut down the factory that employs people. You're not going to politically tell people to go without air conditioning on a hot day. The data center is going to be the first target. I thought that was interesting.

Again, I do think it's striking, and this is not even just in the energy context, but I'm still sort of fascinated by this idea that OpenAI was this company, I think it was founded in 2016, and people saw GPT-1 and GPT-2 and then GPT-3, which came out before ChatGPT. But it was really like that day -- I mean, it was like the day that ChatGPT was announced, even though the technology was in development and there were also these theories and stuff. It was that day of the commercialization, the productization of this technology, where everyone woke up at all these different companies, like, we're in a totally different new world, and we have to revisit all of these investment decisions, whether it's on chips or energy, that we had made maybe just a year ago.

Tracy (50:10):
Yeah, it's almost like -- bullwhip effect isn't the right term, but I'm just thinking the utilities in some respects are at the very end of that demand cycle, right? Even the tech companies woke up very, very suddenly to the boom in AI and how fast this was all going to come about. And the utilities are sort of the last ones to know in that respect, and we're expecting them to react very quickly to it. It's kind of funny.

Joe (50:38):
The other thing too is it'll be fascinating to see if some of these net-zero commitments just have to give. It sounds like the rubber is going to meet the road, but it does not sound like, in the short term anyway, there is a way to accommodate this much increased demand with renewable energy. It doesn't seem like it, anyway. So it seems like something's going to have to give.

Tracy (51:03):
I think we're back to the very start of this conversation, which is the idea of we have these two very different paths where, in an ideal world, if everything goes perfectly, you have all this new commercial interest in technology that requires a lot of energy usage. And so some of those dollars get diverted into building out additional capacity in terms of energy and maybe even additional green capacity.

But the other path is kind of depressing, where you have a bunch of big tech companies that feel existential pressure to do whatever it takes to win the AI race, and maybe whatever it takes includes getting energy through coal or something like that.

Joe (51:43):
You know what I think is interesting, and it hadn't really clicked for me: Brian talked about how after the early eighties, the US basically stopped building nuclear, and we're like, ‘Oh, it was a big mistake. Why did we stop building nuclear?’ But you could sort of understand it in the context of very little demand growth. Why make these really big investments in anything when, at the time, there wasn't as much concern or awareness about climate change and the effects of fossil fuels, and there just wasn't much demand growth?

And so you think about South Korea and China having never really slowed down on nuclear construction -- but also, because they're developing countries, or poorer countries becoming richer, they presumably never had that sort of demand plateau, just by dint of having started from somewhere lower.

Tracy (52:31):
You know what we need? We need ChatGPT to design a small modular reactor, and then we need a robot-building robot to build it…

Joe (52:38):
And a robot, yes.

Tracy (52:38):
Yeah. All right. Well, it sounds like we're probably far away from that, but maybe one day. Okay. Shall we leave it there?

Joe (52:46):
Let’s leave it there.


You can follow Brian Janous at @BrianJanous.