So traditional geothermal is three things coming together: heat and fluid in a reservoir. Those three things come together naturally, right? What next-generation geothermal is doing is basically creating artificial reservoirs. A linchpin to making all of this successful is effective implementation and really prioritizing this work within federal agencies.
While American companies lead in AI development, argue Arnab Datta and Tim Fist, there is no guarantee that America will lead in the build-out of the next generation of AI computing infrastructure. To discuss, we have on Ben Della Rocca, former director for technology and national security on Biden’s NSC, and Tim Fist, a director at IFP, as well as Arnab himself, director at IFP and managing director at Employ America.
Ben helped write the AI infrastructure executive order, and Arnab and Tim just published a magisterial three-part series exploring data centers, AI, and what needs to happen from a policy perspective to make sure that AGI is not only born and bred, but also deployed through data centers in the contiguous United States. Arnab, Tim, Ben, welcome to ChinaTalk. Thank you. Thank you. Great to be here.
All right. Ben, why was an NSC director spending time in SCIFs dealing with energy permitting policy on federal lands? Yeah, no, it’s a great question. Spending so much time reading about environmental permitting law and watching law school lectures on how the Clean Air Act works was not how I envisioned spending some of my time as an NSC director. But there I was, doing just that in the SCIF as part of my AI work.
So, you know, the way I answer that question, Jordan, it’s like you said at the top, right? We have a lead in the United States on AI thanks to our thriving innovation ecosystem and the immense engineering and other talent that we have in this country that has been driving forward remarkable AI innovation in recent years. But that lead isn’t guaranteed. And I think in particular, who exactly will be leading artificial intelligence is going to come down more and more to where exactly AI is able to be built and where it’s able to be built most quickly and effectively.
And by built, I’m not just talking about the technical engineering challenge of how you do large-scale AI training runs from a computer science standpoint, but also the physical building challenge. That is the need to develop the large-scale computing infrastructure and energy infrastructure that running all these chips will depend on in order to actually physically execute these ever-growing AI training runs.
So maybe just to put this in perspective: what became really clear to us in the last administration was AI’s significant impact on national security, throughout the parts of the economy where it’s being more widely deployed, and for advancing science. All of these things ultimately add up to a great deal of national security significance. We also saw, at the same time, how the amount of computing power and energy resources needed to develop frontier models has been rising exponentially.
So just thinking about the trends that we’ve already seen: the computing power needed to train AI models at the frontier of capabilities has been rising about four to five times annually, based on publicly available statistics. That’s an exponential pace of growth. And even when you factor in countervailing increases in the energy efficiency of computational resources and other things that cut against rising power needs, if anything like current scaling trends continue, you’re still getting something like gigawatts of electricity needed to execute frontier training runs within roughly the next few years.
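That projection can be sketched with simple arithmetic. The 4–5x annual compute growth is the figure cited above; the efficiency-gain rate and the starting power draw of a frontier training run are illustrative assumptions, not numbers from the conversation:

```python
# Back-of-the-envelope projection of frontier training-run power needs.
# The ~4.5x annual compute growth is the trend cited in the discussion;
# the efficiency gain and starting power draw are hypothetical assumptions.

compute_growth = 4.5    # frontier training compute grows ~4-5x per year
efficiency_gain = 1.4   # assumed annual improvement in compute per watt
power_mw = 30.0         # assumed power draw of a frontier run today, in MW

for year in range(1, 6):
    # Power scales with compute, discounted by efficiency improvements.
    power_mw *= compute_growth / efficiency_gain
    print(f"year {year}: ~{power_mw:,.0f} MW")
```

Under these assumptions, power needs cross the 1-gigawatt mark around year three and reach roughly 10 GW by year five, which is the intuition behind "gigawatts within roughly the next few years."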
So that’s more or less the main challenge as we saw it. And I should note, too, that it’s worth kind of segmenting this challenge a little bit further in that there’s, you know, as I would say, there’s multiple power related challenges when it comes to developing and deploying AI. You know, on the one hand, you have the need for assuming we see something like current trends continue, gigawatt scale training facilities that would be needed to be developing models at the frontier.
That’s one set of issues. Then there’s a separate but related set of issues around how you develop a robust network of potentially smaller-scale data centers, distributed around the country, to actually use these tools in different locations in an effective manner. So, stepping back to my role at the National Security Council, where I led the White House’s work on AI infrastructure and on developing the AI infrastructure executive order that came out in January 2025: what the executive order most directly addresses is that first challenge I just described, right?
How do you think about these large gigawatt-scale training clusters that, again, assuming the current paradigm of AI training roughly continues, are things we expect will need to be built in order for the United States to maintain its lead? Tall order you set for yourself, Ben. On a high level, what are the things in the executive order that you think will make the most impact when it comes to building out data centers in the U.S.?
So the executive order includes a wide range of things that address not only the question of how we bring gigawatt-scale data centers online in this country, but also a broader, distributed network of smaller data centers around the country as well. There’s a lot of stuff in there, but to pull out a few of the highlights: number one, the centerpiece of the executive order is that it establishes a mechanism by which AI data centers can be built in a streamlined and more efficient way on federal sites owned by the Department of Defense and Department of Energy.
There’s a huge value proposition to building on DOD and DOE sites, because by building on federal lands you cut through, or don’t have to deal with, a lot of the state and local land use permitting requirements that usually slow data center construction. There are additional burdens on the federal permitting side that are taken on as a result, but we can come back to those a little bit later. There’s a lot that the federal government can and will be doing under the executive order to make sure those processes proceed as expeditiously as possible as well.
The second couple of things I’d highlight are in terms of bringing the power generation online. So obviously operating gigawatt scale data centers is going to require a lot of new power to be added to the electric grid. This is challenging for a number of reasons, though, including not only all the permitting requirements that complicate construction broadly, but also delays with interconnection to the electric grid and other approvals that are required.
So what the executive order does is directs the Department of Energy to establish requirements, basically to collect information and to share that information as appropriate with data center developers regarding the existence of unbuilt power projects, but ones that have already received interconnection approvals to make the power procurement process proceed more quickly. It also directs the Department of Energy to engage utilities to push them to reform their interconnection processes in ways that will make interconnection proceed more quickly.
On transmission, finally, that’s the other big area. You have the Department of Energy directed in the executive order to use some of its very powerful authorities, which we can come back to, partner with private sector transmission developers in building transmission lines much more efficiently and quickly than business as usual and to take part in the planning process as well. You also have other actions to bolster the supply chain for transmission and grid equipment, which would be really useful in the long-term vitality of this industry.
So between those sets of things and other actions to make the permitting proceed as quickly as possible that the federal government has within its authority, the executive order does lay out a pathway for building gigawatt scale facilities on the timelines that we expect the leading developers will ultimately need for AI training. Awesome.
Awesome. All right. So to recap, we’re helping Amazon, Google, and Microsoft build giant data centers on federally owned lands and giving a push to a lot of the electricity utilities to, like, hook those places up with the juice they need once they get these great permits. Tim, Arnab, you guys had a great chart in your latest post that gave some check marks to what Ben did and a whole lot of X’s for what this executive order wasn’t able to accomplish — other policy moves you think need to happen in the near term to unlock these gigawatt-scale data centers.
So, yeah. So what’s missing? What’s missing in Ben’s story? I think the executive order that Ben developed really lays out this super valuable framework. The two big gaps that we highlight, sort of the most important things, I’ll talk about two of them, and we can get into the details a little bit later. But the first one is the EO comes along with this clean energy requirement.
So basically all the energy that you’re producing to power these data centers needs to come from clean energy sources, which includes natural gas with carbon capture. But, you know, that is a technology which needs a lot of work to deliver on the sort of timeline that would be required. And I think it’s worth keeping in mind, it’s like the executive order lays out these like super ambitious timelines. It’s like two years to bring like multi gigawatt facilities online. And so I think by having this clean energy requirement, you are kind of compromising the speed at which you can deliver this thing.
So the first thing that we recommend is allowing the build out to happen, at least in the short term, with natural gas plants, which we think is going to be required to actually hit this timeline. And then second, while building on federal lands owned by DOE and DOD allows you to bypass a lot of kind of like state and local permitting issues, it does open up the issue of NEPA, which automatically applies if you’re building on federal land.
So our recommendation around that at a higher level is actually using the Defense Production Act, which you can use both to speed up permitting on federal land, as well as resolve supply chain issues. So Arnab can talk about the specifics here, but basically we think this is a pretty sensible use of the Defense Production Act. The Defense Production Act or DPA, for those who aren’t aware, gives the president broad authority to intervene in the economy where this is seen as necessary to ensure the supply of technology that is deemed essential to national defense.
And our claim is that AI definitely fits within this scope. You know, we see powerful AI systems being increasingly adopted by both the U.S. and Chinese militaries across areas like sensing, surveillance, command and control, and autonomous weapons. And because the most powerful AI systems are now being developed by private firms, a lot of the DOD’s future capabilities are likely going to come from models that are trained in data centers operated by private firms. OpenAI recently announced a partnership with Anduril to bring its models to the battlefield.
Scale AI has built a version of Meta’s Llama, which they call Defense Llama, to help with military planning and decision making. Palantir is building platforms for DOD as well. So yeah, we think this is a sensible use of the DPA and can, as we outlined in the report, and I’ll let Arnab go into more detail, be used in a bunch of ways to speed things up. Tim laid it out quite nicely. Just to be specific about the DPA and what authorities we’re talking about, it’s worth stepping back: the Biden administration had a previous EO on AI more broadly, in which it did invoke the DPA.
Specifically, they invoked Title VII, which allows the federal government to basically compel companies to offer up some information about what they’re doing. We specifically mentioned two authorities in the DPA, Title I, which is prioritization. So like with respect to the energy build out for AI data centers, there’s a lot of supply chain issues and challenges associated with that. And what this would allow is for the federal government to say, “Contractors, you need to prioritize transformers, turbines, etc. that are going to AI data center use.” And so that would help at least alleviate some of the supply chain challenges.
And then the other authority in the DPA is Title III, which is a financial assistance authority. There are a couple of benefits to the Title III authority. One is that it’s got very wide contractual flexibility. Depending on the technology and a lot of other factors, there are different financial structures that would make sense for large gigawatt-scale energy build-outs, and this allows you to design those contracts accordingly.
And then also there’s authorities within the DPA in Title III, where you can streamline some of these permitting issues, particularly as it relates to the procedural laws that Tim described. NEPA, Clean Water Act has some as well. So we think that’s a good use of this. Let’s talk a little bit about those different Defense Production Act powers because like they’re kind of wild.
The whole skip-the-line deal. Let’s stay on that one for a second, Arnab. What does that power allow the U.S. government to do? Yeah, basically, I’ll use a practical example, which is that right now natural gas turbines are basically sold out, right? GE Vernova has said they could be sold out through 2030 and beyond. And so even if a company were to contract, it might not be able to get its turbines on the timeline associated with what Ben laid out in the EO that was signed by President Biden.
What it would allow the president to do is to say, “This is a national security priority, and those contracts need to be fulfilled first.” Specifically, it’s called rated orders. The mechanism, basically, is that the president can determine that something needs to be prioritized for national security purposes, whether it’s turbines, you know, Rankine cycle turbines, transformers, anything associated with the transmission build-out that would tie into these AI data centers. They can just prioritize it. And that means that those companies need to fulfill those orders first.
Well, to respond to some of the points that were just raised here quickly, I would want to just chime in first on the national security need. I think from my standpoint, certainly agree with the points you laid out about the national security implications and importance of this work. The other thing that I would add, too, is I think it’s also important when we’re thinking about the infrastructure context that we take appropriate steps to ensure that the U.S. does not find itself dependent necessarily on foreign providers of infrastructure solely for accessing and operating these models. So that’s kind of another piece that I would add to the tapestry.
I’d say that there’s a lot of ideas about the Defense Production Act here, and I, you know, I won’t discuss any particular deliberations that occurred under the prior administration in any depth. One thing that I will say is I think that in the prioritization of different grid equipment, for instance, you know, there’s a lot of times that this has been done by the Department of Defense using the Title I authority that can be examined. But I think that broadly speaking, it’s worthwhile to do a full examination of what the potentials and options are with the DPA at large.
I think that one thing I’d be curious for your further thoughts on, Arnab and Tim, is the preemption of environmental laws, which you all mentioned here and wrote about in your piece. Just sharing my personal views on this, and this is certainly not legal advice by any stretch: one question I would have with some of these authorities in Title I or Title III is that, in order to get the preemptive effect that is sought, there would have to be some sort of direct conflict with the environmental permitting regulation.
And so one way to deal with that is potentially with the language you just cited, Arnab, the “without regard to other laws” language that’s available for certain authorities in Title III. I agree that that language seems quite powerful and is similar to language in other statutes that has been read to preempt certain permits. Though I think there are other limitations on, for example, the spending authorities that have to be used in conjunction with that provision, which may make it challenging to design a program around.
And that’s at least something that will require further thought to iron out the details if that were to be exercised in the future. And for the provisions of law without that, for the loan and grant provisions that don’t necessarily have the “without regard” language, I think that one question is how would, if this were an option to be exercised, how would a loan be structured that would actually create a direct conflict that is direct enough to rise to the level of preemption under existing legal precedent? I’m not saying that I know the answer to that question. I think that folks would need to look at this more closely to figure out the answer, but I think that there’s a genuine legal question around some of the specifics here.
Yeah, I guess I’ll start from the point of view of the “without regard” clause, which is one mechanism. I think it’s probably the most broad, but I don’t want to say extreme, but maybe like the most aggressive, assertive use of the DPA here. It does only apply to the lending provision in Title III. So it’s only for the lending authority that the “without regard” clause applies. So it’s not all of the financial means of support.
I would say there are a couple of other provisions. There are the emergency provisions that were part of CEQ’s regulations until, I think, just yesterday or today, when they were rescinded. And, you know, there are also protections for classified information. So what I would say is: I don’t want people to think that our idea here, what we’re proposing, is getting out of environmental rules to build this stuff.
The important thing here is like, we’re talking about using these to streamline the procedural laws that are associated with environmental review. So it doesn’t even necessarily mean that an environmental review wouldn’t be conducted prior to, you know, leasing this federal land, for example, if we’re using the example from your EO. But I would say finding ways to limit potentially the likelihood of litigation could be very important here. And that’s where some of the national security exemptions are most useful.
And just to chime in really quickly there, I fully agree with that. I think that’s a really great framing of the question here, which is, you know, how do we streamline the procedural aspects while still ensuring that there is a full and valuable but expeditious review that takes place? Picking up on your point about the emergency exemption aspect of NEPA, I think it’s important to underscore here that that was a product of, you know, solely CEQ’s, I suppose, former regulations. It was their interpretation of the statute.
It is not something that there was necessarily specific statutory language speaking to. And what the result of that is, is that the Trump administration is obviously doing a full review and making revisions to how NEPA will be conducted across the federal government. One question to ask might be whether there is a more appropriate or on-point way to think about what an exemption for national security or similar emergency situations might look like that more clearly applies as appropriate to this sort of situation.
The emergency circumstances language that was originally there, I think, was actually, I would argue, seems in my view to be a little bit narrower than might be able to most obviously apply here. So there may be utility in thinking about what is the appropriate scope of alternative arrangements for NEPA in these types of situations.
Yeah, I mean, let’s just take a step back here, because we got in real deep, real fast. I think the renewables requirement is obviously going to disappear in the next few weeks. NEPA, you know, its regulations were just deleted by the Trump administration. I think the sort of worry, if I’m a hyperscaler, is that I drive my data center through this process and then maybe I get sued four years from now in, like, a Warren administration or something.
But, like, even then, I mean, the data centers are already going to be built, like, we’ll have AGI. You’re not going to get an administration that’s going to shut that off because you took advantage of some, you know, a permissive regulatory environment. But the part that I think was really interesting that both of you had pointed to beyond this sort of permitting stuff are the hard constraints when it comes to actually building the electricity and deploying the electricity that you are going to need in order to build these data centers in the first place.
And a fascinating part of Tim and Arnav’s work was going through all of the potential technologies that could give you the marginal amount of electricity that you need in order to build these data centers and sort of like stack ranking them almost in terms of like what their potential is.
Yeah, let’s do overrated and underrated. What are three overrated sources of electricity that get way too much shine in the broader discourse when it comes to the future of AI? If I can start, Jordan, I’m just going to tell you, I don’t love the word overrated, but I think there is sometimes an implication that, like, natural gas just makes this super easy.
And the reality is that gigawatt-scale energy projects are a massive, massive investment, regardless of how much cash you’re sitting on. Particularly if we’re talking off the grid, which a lot of compute companies are moving towards — meaning not connected to the transmission system, so off the electricity grid. You’re just building it close to your data center, and it’s just powering your data center.
Like, it is very, very expensive and risky to build energy infrastructure at a gigawatt scale. And when you talk about natural gas, like, yes, it’s a proven technology, but there are, you know, like, stranded asset risks if you can’t interconnect and, like, an AI data center, like, outlives its useful life in that location. And if there’s, you know, something that becomes more cost competitive and, you know, a company wants to switch. So, like, I think that natural gas is really important and it should definitely be part of the mix.
But, like, I don’t want to say that, like, natural gas is just an easy decision here because even that’s difficult and there’s supply chain challenges. And I think sometimes people are just like, we can solve this with natural gas. It’s not that easy. All right, Tim. Yeah, so I think the obvious one here is, like, to go to the much more long-term technologies fusion.
So, we see, like, some hyperscalers have been signing power purchase agreements for fusion energy. So, a power purchase agreement is, like, a commitment to buy a fixed number of, like, kilowatt hours at, like, a future point. And, yeah, the fact that they’re buying this for fusion is kind of crazy when there isn’t, you know, a viable commercial fusion reactor that’s ever been demonstrated.
Yeah, this is one that is, like, clearly so far off that it’s just not going to matter over the kind of timeframe that we’re worried about, which is: how do we ensure that we can build this stuff over the next five years? I’ll also push back a little bit on the straight-up overrated framing. But I think a really important lesson of looking at this problem space in depth and examining the different energy options that could be brought to bear — and Arnab and Tim made this point very well in some of their pieces — is that each way you could go has real downsides.
And there are real risks and problems and inefficiencies that are not fully recognized. All that said, to pick one where I think the optimism should be qualified at least a little bit — overrated, if you will — we need to be realistic about the nuclear options in particular. Nuclear energy is something I would be very excited about on a longer-term timeframe, such as the 2030s.
I think it is very difficult to see it being part of the solution for some of the late-2020s challenges that we may face in terms of AI’s energy needs. So that’s one piece; Tim addressed fusion, but for nuclear in general, I think it is worth being realistic about the timing involved.
Well, I was just going to say, like, a broader point to make here is that we’re at this kind of weird point in history, right? Where, like, 20 years ago, the answer to this would have sort of been obvious because, like, renewables were just, like, completely not viable for doing this and there weren’t some exciting next-generation technologies coming along. Right now, we’re currently at this point where a bunch of technologies are all like approaching the same point of cost competitiveness at the same time.
So, like, large-scale solar plus battery storage is now sort of better than natural gas in a bunch of areas. Advanced geothermal is becoming super interesting, but it probably hasn’t been scaled or demonstrated yet. Small modular reactors are just coming online and are probably the next big thing once we can scale them up. But at the moment, in the near term, it just seems like natural gas is kind of the obvious solution — but then it’s going to become an obsolete technology within about 10 years.
So that’s the core problem that we try to grapple with. Yeah, I was bummed out by dams. I thought those could, like, be a thing, but you guys disabused me of that notion. They’re too slow and they’re not big enough. And there’s no new, awesome dam technology that’s been developed over the past 70 years, which I was a little disappointed to read.
All right, Arnab, what are you excited about? I am incredibly excited about next-generation geothermal energy. So this is geothermal energy — energy produced from the heat in the Earth’s crust, the heat beneath our feet, as it’s been called many times. And basically, we are pioneering this energy technology because of our experience with fracking and the drilling techniques we pioneered that led to the shale revolution.
And this is a place where, you know, there hasn’t been enough demonstration yet. There are some companies that are innovating quite fast here. But the scale of it is, like, really remarkable. And it’s a place where the U.S. can really lead because we have an oil and gas workforce where 61% of the workforce has skills that are directly transferable to geothermal. We have a supply chain for fracking and shale production that is ready to go and transferable to next-gen geothermal if we can get there.
And so this is something I think the potential is just incredibly high. And I wish we were doing more to support it. Can we stay on the technology for one second? Like, how do I drill a hole and get electricity out of it? It’s basically what you’re doing is you’re drilling into the Earth’s crust where there’s a lot of heat, and you’re pumping fluids down into that heat. It’s getting heated up, and you’re circulating it back to, you know, a steam turbine that is then, you know, being used to produce electricity.
That’s the simple explanation. So it’s just like a steam boiler with the Earth’s core as the power source. That is incredible. Yeah. And, you know, I would say there are multiple kinds. The main one is EGS, enhanced geothermal systems, where you’re essentially fracking to create artificial reservoirs. So traditional geothermal is three things coming together: heat and fluid in a reservoir, right? And you’re taking advantage of the fact that those three things come together naturally — these are natural geothermal reservoirs. What next-generation geothermal is doing is basically creating artificial reservoirs.
You’re digging, you’re fracking to create cleavages in the crust, and then you’re cycling fluid through it. This is safe, to be clear. This has been tested and demonstrated. This isn’t something that’s going to blow up the Earth. But that’s kind of the simple explanation for how it works.
Yeah, but the Pacific Rim monsters are going to be fine with us? I don’t know. I guess we’ve already dug enough holes for fracking. What’s a few more, right? Just to highlight some exciting stats on this: if you look at the numbers here, the amount of heat energy stored in the Earth’s crust that you can access via enhanced geothermal vastly exceeds the amount of energy in all known fossil fuels by several orders of magnitude.
This is an abundant source of low-carbon energy without any of the intermittency problems of solar and wind that you can also access using a lot of the same tools that we’ve developed to do large-scale fracking as well. This stuff has already been deployed. Google is powering some fraction of its data centers with this as well.
At the moment, it’s just a scaling problem. How do we get the tens of megawatts that are online in the United States at the moment to hundreds of megawatts? The nice thing is, the areas where you can get the most heat out of the Earth’s crust using these methods also overlap substantially with all the areas where you have federal land that can be readily leased. So it’s kind of this perfect recipe for solving this problem.
What, the Earth is warmer under Nevada? The heat is closer to the surface, basically — it’s hotter at shallower depths. If I could also just add one thing to what Tim said there, just as an example: you have to drill three wells to produce about 10 megawatts of energy, in something that’s called a triplet. To get to 5 gigawatts, which is the goal we set for 2030, you would need to drill 500 of those triplets.
And that means basically 1,500 wells, right? We have drilled 1,500 new wells in a given year many times over in the shale regions of this country. This is just not that many new wells to drill if we can perfect the technique. I think this is something we’re very well poised to take advantage of if we can get there.
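The well arithmetic here checks out, using the figures cited above (about 10 MW per three-well triplet, and the 5 GW 2030 goal):

```python
# Checking the drilling arithmetic from the conversation:
# one "triplet" = 3 wells producing ~10 MW of geothermal capacity.
mw_per_triplet = 10
wells_per_triplet = 3
target_gw = 5  # the 2030 goal mentioned above

triplets_needed = target_gw * 1000 // mw_per_triplet  # 5,000 MW / 10 MW
wells_needed = triplets_needed * wells_per_triplet

print(triplets_needed, wells_needed)  # → 500 1500
```

That's 500 triplets and 1,500 wells, matching the comparison to a single year of shale drilling.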
I want to do lessons, policy lessons from the shale revolution. But, Ben, did you want to say something before that? I would just underscore that geothermal is the single energy source that I am most excited about in terms of things that I think are underrated by the broader public.
I also think that AI itself has the opportunity to provide a lot of the backstop sources of demand that can funnel capital to the industry and really incentivize development and some of the technical advances needed to make the United States really a global leader in this technology and advance our energy leadership more broadly.
Arnab and Tim did a great job highlighting the many reasons why this is such a promising technology. Just to highlight what I think the executive order has already set in motion that can really move the needle here: policy choices are going to be really critical to the exact time frame for bringing this technology online and making it a big part of the AI solution.
As Tim mentioned, the places where traditional geothermal resources are available are really out west. The western United States overlaps heavily with places where there are large amounts of land owned by the Bureau of Land Management, for instance. One of the problems with building geothermal projects on federal lands has been federal environmental permitting reviews, which take time.
The executive order has directed the Department of Interior to find ways of doing those reviews much more quickly. So eliminating redundant reviews at multiple places in geothermal projects and also creating what are called priority geothermal zones, where the Department of Interior will focus its permitting efforts to move permitting along as expeditiously as possible.
I would argue we want those zones to overlap with the places where AI data centers are being built, to make sure all of this is moving in the same direction. There's a lot more to be done, but that's a valuable starting point for accelerating development in the geothermal space.
What is the environmental consideration? I mean, we’re not even at a point where we have oil gushing out. Are there endangered species 20,000 feet below the Earth’s surface? Who cares? It’s just like 10 guys in a drill.
It’s a great question. And certainly, the environmental repercussions are fewer than traditional fracking for the oil and gas sector. With any sort of construction project like this, you have to literally build a power plant, and that involves some changes to the natural environment. If there is an endangered species where we want to build the power plant, that will be a factor in the environmental analysis.
Drilling deep down can also have impacts on the broader region. But the environmental burdens are a lot less, and that's the reason why, in general, there's potential for permitting to go more quickly. It's a question of marshaling the right policy resources to ensure we are moving as quickly as possible, given the lesser concerns with this technology.
I wonder how many endangered species we'll have left once the administration is done with us. But anyways, Arnab, what are some lessons from the shale revolution that could apply to the U.S. government incentivizing the development and production of geothermal?
I should level set and say that in the 1970s, coming out of the Arab oil crisis, we made a very conscious effort to support nonconventional production of energy. By the mid-2010s, we were the leading producer of oil and natural gas. How did that happen?
I think there were four policy interventions over those decades that I would zero in on. My colleague Skanda Amarnath and I wrote about this last year. The first was a lot of research and development and cost-share programs to innovate in drilling and develop these new techniques. That was really important.
You had the Department of Energy working directly with Mitchell Energy to own some of the costs of drilling and testing these nonconventional means of production. The second thing I would point to is supply-side production tax incentives and demand-side price support. On the supply-side production incentives, there was a Section 29 tax credit, a production tax credit from nonconventional sources.
An analogy now is the Inflation Reduction Act, which is supporting a lot of production for new types of energy. It’s important those credits stay in place. On the price support side, you had targeted deregulation in the Natural Gas Act in 1980 that exempted energy produced from nonconventional sources from price controls that were in place.
The third thing is permitting changes. The regulatory environment changed in 2005 with an Energy Policy Act that established a legislative categorical exclusion. A certain type of production with a specific geographic footprint could go through the lowest level of NEPA analysis and get approved.
The fourth thing that goes underrated, but is essential to the shale revolution, is a very accommodative macroeconomic environment. People often overlook this, but the shale boom and increases in productivity happened at a huge scale in the late 2000s and early 2010s when interest rates were low. Companies could take cheap debt and iterate, with a lot of capital available for them to build productivity enhancements.
These four things highlight what happened in the shale revolution. We need to figure out how to compress that timeline for next-gen geothermal. So how far away are we today from these awesome steam boilers?
Fervo right now is building a 400 megawatt facility. I don't know the exact state of the project's development, since they're a private company and I'm relying on public information, but they've demonstrated that their technology works at a small scale. Tim mentioned a Google facility: one of Fervo's smaller plants is powering a data center at around 40 megawatts.
There's a company based in Calgary, Canada called Eavor that is also executing a similar project. I think it's really just a question of whether we can get to scale. The big question for these companies is whether they can secure enough capital to demonstrate that the technology works and can produce at utility scale.
You mentioned capital. Where hasn’t this been coming from and where should it be in order to realize this vision over the next five years for geothermal? One thing we talked about in our report is the challenge of financing these next generation technologies.
There’s a tremendous amount of uncertainty associated with developing these technologies. You need to get people comfortable with that level of uncertainty that comes with project development. They need to be okay covering the costs when they go higher because a permit takes longer than it should, or there’s a supply chain snarl. Higher interest rates also complicate things.
These different kinds of uncertainties add up. Equity investors are reluctant to invest because, at sufficient scale, there are only a handful of venture capital firms that focus on this type of investing, and they tap out quickly. You can’t get banks to do it because the risk of failure with that uncertainty is very high.
The government is playing a role in financing this. If you take a look at the Office of Clean Energy Demonstrations, they have funded demonstration projects for small modular reactors. In the shale context, the federal government was sharing costs with Mitchell Energy to drill. We need some version of that.
We also need the federal government to reduce that uncertainty. In technology demonstration, you start with a concept paper to get investment for a small-scale project, and once you have that investment, you move to a bigger scale. It's a slow, incremental process of gaining investor comfort.
We need to compress that timeline and lower uncertainty at each stage so people and companies can invest. Financing energy projects typically requires three things: debt, equity, and offtake agreements, meaning someone to purchase it once you are producing it.
Right now, we see a lot of headlines about AI companies investing in energy projects, but they are mostly doing it through power purchasing agreements. One thing to understand with next generation energy is that there is a tremendous amount of uncertainty. It comes from things that are not quantifiable, like permitting and regulatory timelines, and physical feasibility, like material bottlenecks that can emerge.
One of those three participants needs to own that uncertainty. It's generally not going to be the offtake side: you would need a very high premium on offtake agreements to cover a level of uncertainty that would make a debt or equity investor confident their investment won't fail.
If the federal government can own that uncertainty by putting up some capital or if it can reduce it through streamlined regulatory procedures, it could unlock the capital these companies need to invest in projects upfront. Right now, there isn’t someone putting up that capital upfront, and that’s the barrier we are trying to solve.
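The point about who owns the uncertainty can be made concrete with a toy expected-value sketch (all numbers here are hypothetical illustrations, not figures from the report):

```python
# Toy model (hypothetical numbers) of why offtake agreements alone
# struggle to absorb first-of-a-kind project uncertainty. If a project
# fails, investors get nothing, so a risk-neutral investor needs the
# success-case payoff marked up enough to offset the failure risk.

def required_ppa_premium(p_failure: float) -> float:
    """Markup over a zero-risk project's price such that the expected
    payoff matches the zero-risk payoff:
    (1 - p_failure) * (1 + premium) = 1  =>  premium = p / (1 - p)."""
    return p_failure / (1 - p_failure)

for p in (0.05, 0.20, 0.40):  # mature tech vs. first-of-a-kind projects
    print(f"failure risk {p:.0%} -> required PPA premium ~{required_ppa_premium(p):.0%}")
```

The premium grows nonlinearly with failure risk, which is the intuition for why de-risking by the government (streamlined permitting, cost-sharing) can be cheaper than trying to price all that uncertainty into power purchase agreements.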
Many tech companies are sitting on a lot of cash, but they’re not putting it into energy investment. We believe that federal government authorities can reduce or own that uncertainty, which might encourage companies to directly invest. I should say one company, Amazon, has invested directly in an SMR project in the Pacific Northwest, and that’s a model we would like to see happen more at a larger scale.
So, I do like this. You gave me some great new acronyms, Arnab. We have FOAK, SOAK, THOAK, and NOAK: first of a kind, second of a kind, third of a kind, and nth of a kind. We're still in the first-of-a-kind universe.
Let me push back on this hype train a little bit. In three years, we’re going to go from some cute demonstration projects. I take your point on figuring out the pumping mechanism and whatever, like you have a lot of people happy to live in Nevada for a while and drill big holes in the earth. But I don’t know. AGI is coming soon; is this really going to get us there?
I think this underscores the validity of the all-of-the-above energy approach, where we want to take multiple shots on goal. We don't want to put all our eggs in the geothermal basket. A staging approach across different technologies could work well under this strategy: deploy natural gas plants first, because we know those can come online quickly and provide secure, reliable energy.
Solar and battery storage are promising. Building out geothermal should occur alongside that. Small modular reactors come after that as well. If you want to start thinking about fusion, that could come online in 20 years. Basically, we should invest in as many of these technologies as possible at once to address this technological risk with the next generation stuff.
I also think within the geothermal space, we can discuss this layered or staged approach where we lean more on some geothermal technologies at one point and then on others. Traditional geothermal technologies are more tried and tested.
We’re not necessarily doing first of a kind projects with hydrothermal resources that have been in place for many years. Those may not deliver a huge amount of gigawatts onto the grid, but they could still contribute meaningfully by 2028, especially if we encourage exploration and resource confirmation as data centers are being built.
Beyond 2028, it might be more realistic for enhanced geothermal projects, which are still at the first-of-a-kind stage, to come online. There's definitely a phasing we can do that way. Tim makes an excellent point that each of these approaches has different strengths and weaknesses.
It’s unrealistic to think that any one energy source will solely address AI’s energy needs. Combining solar and batteries in particular can be a way to access firm power. There are downsides to scaling, but it’s also faster to build solar plants. There’s been work done to review the environmental impacts of solar development in some Western lands under government management, which could speed up some projects.
Folks should really look at a wide range of opportunities and leverage different site-specific opportunities. Can we talk a little about transmission lines and transformers? Arnab, you mentioned that a lot of this may end up being off-grid, where Google's responsible for building the power next to its new data center.
To what extent does hanging transmission lines over farms actually matter for this stuff? I think transmission is a big part of the equation. You could imagine a world where all the power resources are co-located, and we don’t need to transport any of it. That’s theoretically a solution, but it’s unlikely that we’ll find sites for gigawatt-scale data centers where we won’t need transmission lines.
At a minimum, having transmission lines provides multiple benefits. It puts a larger range of energy resources within reach: if you build a data center around various energy sources, you can tap into more of them. Even with onsite power generation, interconnecting that power to the electric grid offers stability benefits, reduces the need for microgrids, and mitigates some financial risks, because if you use less power than expected, you can resell it onto the grid.
Transmission lines are important regardless. I want to highlight a couple of things from the executive order that could help build transmission infrastructure for AI data centers. The Department of Energy has important authorities in statute to address these problems. One relatively well-known authority is the ability to establish national interest electric transmission corridors (NIETCs), which allow the Department of Energy to accelerate certain permits if they are taking a long time and impeding efficient development.
That could be useful in the longer term, though establishing a NIETC takes time. Another set of less-known authorities should really be explored: the Department of Energy can partner with developers of transmission lines, which can be powerful. There are several statutes that allow the Department of Energy to create public-private partnerships with companies to upgrade and construct transmission lines.
Past analysis from the Department of Energy has suggested that some of these authorities might bypass lengthy state approval processes and enable more efficient cost allocation and processes typically handled by state public utility commissions, which can take years to complete. Using these authorities more proactively could provide a faster pathway to building transmission lines, especially shorter ones, to connect data centers to the grid.
We're not talking about giant transmission projects, but targeted transmission builds that the DOE can work on with the private sector could be a pathway to putting more gigawatts on the grid by 2028. Arnab, anything to add? I would second everything Ben said. Transmission is really important, and the short-term solutions Ben identified are promising.
The reason firms are moving off-grid is that transmission is just so difficult. This highlights the need for longer-term reform, and permitting reform has to come from Congress. There's only so much that can be done through the executive branch, and it's often more imperfect.
Some stats on transmission are shocking. It takes an average of 10 years to build a new transmission line in the U.S., largely due to permitting holdups. Ten years ago, we built 4,000 new miles of transmission lines annually. Now it's about 500 miles, roughly an eightfold decline.
This is a hard problem. Ben, what's a transformer? Why does it matter? Conceptually, a transformer transforms the voltage of an electric current. This transformation is essential to step electricity down from transmission lines, or other higher-voltage environments, to voltages that an end-user facility can accept. We need transformers within electric infrastructure to bring power to the uses we ultimately want to serve. The challenge with the transformer industry, though, is that manufacturing capacity is limited; at least with the resources currently allocated to transformer production, the industry can't supply the number of transformers we anticipate will be needed to build all the power infrastructure AI is demanding.
And so I think there's a lot that can be done to support the transformer industry, in terms of loan guarantees or other financing options that can be provided or encouraged by the government, basically to allow the industry to make the capital expenditures needed to expand its facilities, train a new workforce, and add workers to existing facilities.
All these things are going to be important for bringing down transformer lead times, which I believe are currently in the two-and-a-half to three-year range. So there's certainly an important supply chain aspect to this problem.
Since we were talking about the Defense Production Act earlier: now that corruption is in and FARA is dead, how far can a president go who really just wants to let their main consigliere, who happens to be building giant AI data centers, get all the gas turbines before everyone else? I mean, is there any recourse for something crazy like that?
Look, I think broadly with DPA use, when you're utilizing any legal authority that aggressively, that assertively, it's good to try to get political buy-in, too. Even if you believe the legal authority is bulletproof and you can do what you want, you still want political buy-in.
The fact that Ben's here talking about how the Biden administration really prioritized this because they saw a threat, and the fact that the Trump administration also thinks this is a threat and that there should be a national security aspect to the AI data center build-out, shows there is some consensus.
The other thing I would say is that the DPA is up for reauthorization, and that's something that typically happens in a bipartisan fashion. It's a place to take advantage of that consensus: appropriate X amount of dollars and X authorities, or even add to them, so the DPA can be utilized to help our energy infrastructure build-out for AI data centers.
Like, put some safeguards on it. If Democrats are really concerned about the hypothetical that you just gave, that can be a negotiating chip for what is typically a bipartisan reauthorization. And so I think I would say, generally, if your concern is around corruption with this stuff, there is an opportunity here because the DPA is up for reauthorization.
There were some things that happened in the Biden administration with respect to using the DPA for heat pumps that Republicans didn't like, and there were hearings on this last year. Generally, it's just worth understanding that building some kind of legislative consensus could be useful here.
I think Arnab has made a lot of great points. One thing I'd add is that in this issue space, the role of litigation shouldn't be underestimated in terms of the check it can play. This has long been the dynamic with infrastructure projects in many different industries.
But the way that litigation works around the National Environmental Policy Act and other sorts of permitting-related statutes is that you have litigation that can basically allege that permitting requirements haven’t fully been fulfilled. And that can result in injunctions that ultimately delay projects while court proceedings are ongoing.
So, in effect, what that means is there’s a really significant value to making sure that all of the T’s are crossed and I’s are dotted when it comes to pursuing an option that uses national security authorities. Because if you do something that goes outside the bounds of the law or isn’t necessarily an airtight legal case, then the odds of litigation go up.
And that can ultimately result in projects being delayed. If you can't get past the pre-construction stage because you're dealing with lots of litigation, that can have huge consequences for whether you're able to build artificial intelligence. And there's a real premium on time in this space.
So, I think that the need to make sure that the laws are really followed very closely here for a wide variety of reasons shouldn’t be understated. Yeah, I mean, I think a lot of the dynamics that you guys just pointed out also apply to all the NEPA stuff we were talking about, because it’s like, yeah, Ben was doing some cute things here and there to try to make it easier for the firms.
And then Trump goes and cancels all of it, which is maybe legal; I don't know, we're going to find out. On the one hand, that's probably pretty exciting for Google and Amazon. On the other, you're opening yourself up to a whole new legal attack surface that you wouldn't face under, say, a Harris administration that followed the executive order's direction more closely.
Anything else to add on that dimension? I just say really quickly, I think your point about the uncertainty here is exactly right. Trump’s rollback of NEPA, as it has existed for years, certainly has the potential to speed things along, but it doesn’t ultimately get rid of the fundamental statutory requirement, which is for agencies to essentially do the best they can to review the environmental consequences of their actions.
And absent regulation, there’s going to be a huge amount of ambiguity and uncertainty as to what that means. And there will still be years of past practice that courts may look to, to fill in what exactly the content of the statutory requirement is. So I think that the actual magnitude of the impact from the efforts to rewrite NEPA regulations really remains to be seen.
It may take years to play out. One quick thing to add to that: the CEQ regulation, the government-wide NEPA rule issued by the Council on Environmental Quality, was rescinded. But every agency still has its own rule in place for how to conduct a NEPA analysis, and those are still in effect. So right now, that's the rule of the road.
So there's a long-term uncertainty that Ben is right to talk about here. But in this immediate moment, the regulatory framework still exists for agencies, and that's important for people to know and continue to comply with.
All right, Tim, we're talking about all these goodies we're going to give these hyperscalers, from help funding their energy sources to breaks on environmental controls. But you wanted to throw in a little hook to make them take AI security more seriously. What is the market failure here? And what sorts of things do you think the government should add to their requirements in exchange for all these special dispensations?
Yeah, great question. So, as you mentioned, one of the core claims that we advance in the report and that I think is well represented in the previous administration’s executive order is that there’s this market failure around AI security.
So, what does this mean? As I’m sure many guests on your podcast have sort of repeated a lot, U.S. companies and U.S. AI companies are currently building models that they think within just a few years could be used to reshape the global balance of economic and military power.
So, think AI systems that can autonomously carry out massive cyber attacks or automate the process of scientific R&D or serve as substitute remote workers for many kinds of jobs. If this is true, then we really need to be protecting these systems against theft by bad guys. And it turns out that a lot of the security problems that you need to solve to do this are at the data center level.
But protecting against sophisticated threat actors like nation-state hacking groups is both really hard and really expensive. And if a company invests in this to the adequate level, they risk falling behind everyone else who isn't doing it. So the idea we advance, which is also a core part of the existing executive order, is to tie this assistance, both loans and permitting assistance, to strong security requirements.
So, create a set of security requirements that hyperscalers and AI companies can both follow, raising the level of security protecting their critical IP so they can actually defend against these kinds of threats. By tying it to this assistance, you turn something that currently puts you at a disadvantage relative to your competitors into something that just looks like a really strong commercial decision.
We outline a bunch of ideas for what this could look like specifically. I think it looks like finding the best existing standards and applying them across the board. It’s coming up with new standards and guidance that are specific to the threat model for attacks on AI model weights specifically. And then it’s probably a lot of government assistance in doing things like supply chain security, physical security for AI accelerators, background screening for personnel, protecting against insider threats, counterintelligence playbooks, this kind of thing.
But yeah, the basic idea is to enter into a strategic partnership between the government and the AI industry to level up security, with a carrot on the other end to make it all worth it. The piece that really struck me in reading the Dario article, about his world in which America gets ahead and we accelerate faster than China and create Dr. Manhattan and get to rule the world or something, is that if we're in that timeline, the incentives for the Chinese government to just leave these data centers alone fall to basically zero.
All right, if you can steal the model weights, then maybe you want OpenAI and Anthropic to continue existing to make cool stuff that you can take and deploy. But once you get to the technology that is to rule all technologies, you start to get into a U.S.-Iran dynamic where stuff like Stuxnet or drone attacks comes into play.
What if there’s just a giant leak that fries all your servers? The physical security side of these tens, hundreds of billions of dollars that are getting thrown into these data centers is something that hasn’t really been talked about a lot. But it’s easy to see the potential future where that ends up being a core part of what the U.S. government and the firms themselves need to start focusing on to keep safe.
Yeah, and just to respond first to the difficulty of protecting model weights, I agree, this is a hard problem. I think I’m more optimistic about protecting models than you are, but with the caveat that we need to think about the scope of things that it’s useful to protect.
To be more specific about this, I expect that over the next few years, the most powerful models developed by U.S. Frontier Labs are going to be deployed internally first. There are kind of three reasons for this. First, as capabilities grow, there will be a bunch of misuse concerns that labs are going to want to address before they deploy widely.
The second is that deploying internally before you deploy widely makes a lot of technical and economic sense, as you can use the model to help accelerate your own R&D before releasing it more broadly. And last, it makes sense to first train the big, expensive model and then distill it down to a version that's more economical to serve to users, releasing the distilled version first.
This is reportedly now common practice across basically all the Frontier Labs. So if this is true, then protecting models at the absolute bleeding edge can be done in a more favorable security environment where the attack surface is relatively smaller because you’re initially only deploying for internal use cases.
And so, yeah, eventually I think these models will get stolen. But protecting the bleeding edge from being stolen immediately is still worthwhile, because it lets you keep your overall lead by investing your inference compute into AI research and development, and lets you use those models to develop things like AI-powered cyber defense.
I think there's a lot of hand-waving in this theory of victory, and a lot of unknowns. But seriously trying to predict this stuff is worth it, because the alternative is putting all this money into power and chips and then freely handing the products to China, which can use your powerful models to accelerate its own AI research programs.
And then, yeah, preventing these denial-slash-sabotage operations is also a worthwhile goal. I've seen some interesting research recently about the susceptibility of current AI data centers to cheap drone strikes, as well as attacks on surrounding network and energy infrastructure. I don't have a view at the moment on how expensive this will be to defend against, but it certainly needs to be a huge part of the defensive investments that need to happen.
The one thing I’ll say is, look, if America is going to win, it’s going to need PRC nationals working in these labs. And if we’re doing FBI counterintelligence checks on every AI PhD Berkeley graduate, I’m sorry, we’re just not going to have an AI ecosystem.
So, look, there's some middle ground there, but I don't know. That's the one piece I was most skeptical of.
I had one sort of random question. There was this very funny chart that Tim and Arnab had where, I think it was Google, Amazon, and Microsoft, they're all committed to being net zero by 2030. And they're on this trend line, and it just starts to go the wrong way once they realize they have to build tens of billions of dollars of data centers.
Do those commitments just go away in our anti-DEI world? Is there anything statutory about it? Is Blackstone going to get mad at them? I mean, what’s the forcing function here that would keep them on those trend lines absent some really amazing geothermal breakthrough?
I would say that, you know, I wrote about this recently. The way to get more adoption of these newer technologies that are firm and emissions-free is for them to become cost-competitive and quick to deploy. I don’t know how firm the commitments are from Amazon and Google. I don’t know how sticky their internal social costs of carbon are.
That’s not for me to say. I’m trying to think about policymakers and what we can do to get to that place and reduce those costs. I think the commitment is real, but it’s probably not going to stop a company. Even putting a coal plant online isn’t going to stop a company if they know they can get AGI first. Right?
I think it’s our jobs to figure out, if you care about climate change, if you care about decarbonizing, it’s our jobs to figure out how to make that happen as fast as possible.
Yeah, agreed that making these energy sources affordable is the best way to ensure they’re adopted. I think that the related piece of that, though, is making sure that the timeline to actually permit them and bring them online is efficient and fast as well.
In some cases, clean energy technologies, including emerging ones, can be brought online more quickly than less clean sources because the permitting timelines are actually shorter for those technologies. That can provide a strong incentive for industries such as AI to choose the faster permitting route, because there’s a large financial premium they could earn from bringing their AI models online and operational, say, six to twelve months earlier.
There was a lot of work in the last administration to set forth actions that will address AI’s energy needs, and we’ve discussed a lot of potential ways forward in this discussion as well. I think it’s really critical to remember that a linchpin to making all of this successful is effective implementation and really prioritizing this work within federal agencies.
Ensuring that folks are focused on completing this work effectively, fully, and quickly, and making sure that the work starts on time and proceeds according to a schedule is going to be extremely important. I think that, Tim and Arnab, in your paper, one of your recommendations was for the White House to have an AI infrastructure czar of sorts to kind of oversee and spearhead this work.
This work is ultimately very complex and very interdisciplinary. It’s not just a national security challenge; it also spans energy policy, law, and environmental permitting law. It will require strong leadership from the White House and the federal government to make sure that things happen as envisioned.
So I think that’s one simple but important point to underscore: the implementation side of this really matters. Arnab, Tim, thank you so much for being a part of China Talk.
When I refer to “built,” I’m not only discussing the technical engineering challenges of executing large-scale AI training runs but also the physical infrastructure needed. This includes the large-scale computing and energy infrastructure that these AI systems will rely on to carry out increasingly demanding training efforts.
To put this in perspective, we recognized during the last administration the substantial impact of AI across national security and many sectors of the economy. AI is being implemented more widely to advance science across all domains, and it is taking on significant national security importance. At the same time, we observed a rapid increase in the computing power and energy resources necessary for developing frontier models.
Trends suggest that the computing power required to train cutting-edge AI models is rising by about four to five times annually. This is an exponential growth rate. Even when considering improvements in energy efficiency, we anticipate needing gigawatts of electricity to run frontier training efforts within the next few years, provided current scaling trends hold steady.
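As a rough illustration of how those trends compound, here is a back-of-envelope sketch. The baseline cluster size, growth rate, and efficiency figures below are illustrative assumptions, not numbers from this discussion:

```python
# Illustrative projection of frontier-training power needs.
# Assumptions (not from the discussion): a 100 MW baseline cluster,
# training compute growing ~4.5x per year, and hardware efficiency
# improving ~1.4x per year.
baseline_power_mw = 100.0
compute_growth_per_year = 4.5
efficiency_gain_per_year = 1.4

for year in range(1, 5):
    # Net power growth is compute growth divided by efficiency gains.
    power_mw = baseline_power_mw * (compute_growth_per_year / efficiency_gain_per_year) ** year
    print(f"Year {year}: ~{power_mw / 1000:.1f} GW")
```

Under these assumptions the draw crosses a gigawatt in about two years and ten gigawatts in four, which is the shape of the trend Ben describes.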
That essentially highlights the primary challenge we faced. It’s worth noting that there are multiple interconnected power-related challenges in developing and deploying AI technologies.
On one side, we have the demand for large-scale gigawatt training facilities required for training frontier models. On the flip side, we also need a robust network of potentially smaller distributed data centers across the country to optimally utilize AI tools.
Reflecting on my role at the National Security Council, I led the White House’s efforts around AI infrastructure, which encompassed developing the AI infrastructure executive order released in January 2025. This executive order primarily addresses the challenge of building those large gigawatt-scale training clusters.
Tim: So Ben, given the ambitious goals set out in the executive order, what high-level elements do you believe will have the most significant impact on constructing data centers in the U.S.?
Ben: The executive order incorporates a broad range of initiatives, addressing not only the establishment of large gigawatt-scale data centers but also fostering a distributed network of smaller data centers across the country. There are many components involved, but let me highlight a couple of key aspects:
First and foremost, the executive order establishes a framework to build AI data centers on federal sites owned by the Department of Defense (DOD) and the Department of Energy (DOE) in a more streamlined and efficient manner. There is substantial value in utilizing DOD and DOE sites because doing so bypasses many state and local land use permitting requirements, which can often complicate data center construction.
However, there are some additional responsibilities associated with federal land permits that we should address later. The federal government has the capacity and intent to expedite these processes as much as possible under the executive order.
Secondly, we need to focus on bringing power generation capabilities online. Operating gigawatt-scale data centers will require significant amounts of new power to be added to the electric grid. This task is complicated by numerous factors, including intricate permitting requirements, interconnection delays, and various approvals that must be secured.
The executive order instructs the Department of Energy to gather and disseminate information regarding unbuilt power projects that have already received interconnection approvals. This will expedite the power procurement process for data center developers. Additionally, the Department of Energy is directed to collaborate with utilities to reform their interconnection processes, promoting more efficient connections.
A major area also includes transmission. The executive order empowers the Department of Energy to leverage its capabilities and partner with private sector transmission developers to construct transmission lines more effectively and swiftly than conventional methods. Moreover, there will be additional actions taken to strengthen the supply chain for transmission and grid equipment, which will be crucial for the long-term viability of this industry.
Overall, the executive order lays out a clear pathway for constructing gigawatt-scale facilities on the timelines that leading developers will require for AI training.
Jordan: To summarize, we’re working with AWS, Amazon, Google, and Azure to construct massive data centers on federally owned lands while also facilitating the necessary electrical utilities to connect them once they secure the appropriate permits. Tim, Arnab, you both included a compelling chart in your recent post that highlighted the achievements of Ben’s efforts, but also identified several shortcomings of the executive order that must be addressed through additional policy moves to unleash these gigawatt-scale data centers.
So, what’s missing? What gaps do you see in Ben’s narrative?
Tim: The executive order Ben worked on is an incredibly valuable foundation, but we have identified two major gaps that deserve attention. Let me discuss the first one. The executive order mandates that all energy used to power these data centers come from clean sources, including natural gas with carbon capture. However, carbon capture technology requires significant development before it can meet the necessary timelines.
Although the executive order sets ambitious timeframes—like bringing multi-gigawatt facilities online within the next two years—the clean energy requirement could compromise the speed at which these projects can be delivered. Our primary recommendation is allowing short-term use of natural gas plants during this transition to meet these timelines.
The second issue revolves around the NEPA (National Environmental Policy Act) implications. While locating on DOE and DOD lands may allow us to bypass many state and local permitting hurdles, it automatically triggers NEPA regulations.
Our suggestion here is to utilize the Defense Production Act (DPA), which can expedite permitting on federal land and help resolve supply chain challenges. Arnab can share more specifics regarding this, but we believe the DPA is a practical option to consider in this context.
For those who are unfamiliar, the DPA provides the president with extensive authority to intervene in the economy when necessary to secure the supply of technology deemed critical for national defense. We posit that AI falls within this scope because powerful AI systems are increasingly adopted by both the U.S. and Chinese militaries across various domains, including surveillance and autonomous weapons.
Furthermore, as the most advanced AI systems are being developed by private firms, the DOD’s future capabilities will likely rely heavily on models trained in those private data centers. Recently, OpenAI partnered with Anduril to deploy its models in military applications, while Scale AI has created a version of Meta’s Llama designed for military planning.
Arnab: Tim articulates this quite well. We believe the DPA serves as a suitable strategic tool to accelerate processes. Title I of the DPA prioritizes contractors focusing on providing essential items, such as transformers and turbines, specifically for AI data centers. This would help alleviate some of the associated supply chain challenges.
Additionally, Title III of the DPA offers financial assistance. This provision allows for diverse contractual arrangements tailored to technology needs, which is particularly advantageous for large-scale energy projects. There are also mechanisms within Title III that streamline permitting issues related to procedural regulations like NEPA and the Clean Water Act. This is a practical application we can consider.
Jordan: Let’s delve deeper into the Defense Production Act and examine its capabilities. What powers does it grant the U.S. government?
Arnab: A practical example would be the current situation with natural gas turbines, which, according to GE Vernova, are essentially sold out through 2030. If a company were to contract for these turbines, there’s no guarantee they’d arrive in time for the timelines established by the executive order signed by President Biden.
What the DPA would allow is for the president to declare this a national security priority, thereby ensuring that those contracts get fulfilled first. Specifically, this is referred to as “rated orders,” allowing the president to prioritize the fulfillment of contracts deemed essential for national security—whether that involves turbines, transformers, or any other equipment tied to transmission improvements for AI data centers.
Ben: Responding briefly to what’s been raised, I want to emphasize the national security necessity of this endeavor. I completely concur with the assessments regarding the implications for national security. Furthermore, it is essential that we take steps to avoid becoming reliant on foreign sources for infrastructure that supports operating these AI models.
While I can’t discuss the past administration’s deliberations in detail, I believe the prioritization of grid equipment is worth exploring further, especially with the DPA’s Title I provisions utilized previously by the Department of Defense.
I’m curious though about your thoughts on the environmental law preemption that you mentioned in your report. Sharing my personal view—not as legal counsel—there may be a legal question regarding whether direct conflicts exist between the DPA authorities and environmental permitting regulations.
One approach could involve using the “without regard to other laws” language included in certain Title III authorities, which has proven effective in similar contexts. That said, considerations regarding the financial arrangements tied to that language may introduce complications that warrant careful examination.
Arnab: Absolutely, the “without regard” clause is indeed a potent tool, but it only applies to lending provisions within Title III. Furthermore, we should recognize that our objective is to streamline procedural laws associated with environmental review rather than completely bypassing them.
Taking that into account, I believe there is a significant opportunity to minimize the likelihood of litigation, which can be particularly advantageous in national security exemptions.
Ben: Quickly adding to that, I share your viewpoint. The goal is to find a balance between expediting procedural aspects while ensuring valuable environmental reviews are carried out in good faith. On the subject of NEPA, it’s important to note that the recent adjustments to CEQ’s interpretation may open the door for broader applications in national security-related situations.
Jordan: I think we should step back for a moment. With the recent changes in NEPA regulations and the overall shift in policy direction, there may be pressing concerns for hyperscale companies as they navigate the return of more stringent regulations. Though, to your earlier point, if data centers are built during a more permissive regulatory environment, it’s questionable whether future administrations would shut them down.
What I found particularly intriguing in Tim and Arnab’s analysis is the emphasis on the hard constraints related to building the energy infrastructure necessary to develop these data centers in the first place.
So, as you think about energy options, what do you consider to be overrated or underrated sources of electricity in the context of future AI development?
Tim: Let’s talk about what may be overrated. My gut reaction is that natural gas is sometimes perceived as a simple solution. However, constructing gigawatt-scale energy projects, particularly off-grid, is a massive investment no matter how much capital you can put behind it.
While natural gas is a proven, reliable technology, it carries stranded-asset risk: if cheaper options emerge over the plant’s lifetime, or the AI data center outlives the site’s usefulness, transitioning away from the gas investment may be difficult.
Arnab: I think fusion energy is another technology that often receives a disproportionate amount of optimism. It’s fascinating that some hyperscalers are signing power purchase agreements for fusion energy, yet commercial fusion reactors have yet to be practically demonstrated. For our immediate concerns, this technology is too far off to be a relevant solution—especially given our five-year timeline for building necessary infrastructure.
Ben: I want to reinforce that every energy option comes with its own risks and challenges. For example, while nuclear power shows promise for the future, I also contend that its viability for meeting AI energy needs in the late 2020s should be approached with caution. Time frames matter significantly when assessing potential sources.
Tim: Certainly, we find ourselves in a complex scenario today where numerous technologies are converging toward cost effectiveness. Technologies such as large-scale battery storage are becoming competitive with natural gas, while advanced geothermal options are progressing, albeit at a slower pace.
Overall, while the near-term outlook may favor natural gas as a primary solution, it’s essential to recognize its potential obsolescence over the next decade, adding another layer of complexity to our planning.
Arnab: On an optimistic note, I am extremely excited about the prospects of next-generation geothermal energy. This technology harnesses energy from the Earth’s crust, capturing the heat beneath our feet. Our innovations in drilling techniques, thanks to experience from the fracking sector, position us uniquely to advance this energy solution.

Ben: There hasn’t been enough demonstration in this area yet. Some companies are innovating quickly, but the scale of the opportunity is really remarkable, and it’s a space where the U.S. can truly lead. We have an oil and gas workforce where 61% of the workers possess skills transferable to geothermal energy. Additionally, we have a supply chain from fracking and shale production that is ready to pivot to next-gen geothermal if we can navigate the transition.
I believe the potential here is incredibly high, and I wish we were doing more to support it.

Can we stay on the technology for a moment? How exactly does one drill a hole and get electricity out of it?

Essentially, you’re drilling into the Earth’s crust, where there’s significant heat, and then you’re pumping fluids down to that heat source. The fluid gets heated up, and you circulate it back to a steam turbine, which is then used to generate electricity.
That’s a simple explanation, but it’s akin to a steam boiler using the Earth’s core as its power source. That’s pretty incredible. There are various types of geothermal systems. For instance, enhanced geothermal systems (EGS) use a fracking-like process to create artificial reservoirs.
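A minimal sketch of the underlying thermodynamics, with illustrative flow rates, temperatures, and conversion efficiency (none of these figures come from the discussion):

```python
# Rough electrical output of a single injection/production well pair.
# All values below are illustrative assumptions.
flow_rate_kg_s = 80.0         # mass flow of circulated water, kg/s
cp_water = 4184.0             # specific heat of water, J/(kg*K)
t_production_c = 200.0        # fluid temperature coming back up
t_injection_c = 60.0          # fluid temperature pumped back down
conversion_efficiency = 0.12  # heat-to-electricity at these temperatures

# Thermal power carried by the fluid: mass flow * heat capacity * delta-T.
thermal_w = flow_rate_kg_s * cp_water * (t_production_c - t_injection_c)
electric_mw = thermal_w * conversion_efficiency / 1e6
print(f"~{thermal_w / 1e6:.0f} MW thermal -> ~{electric_mw:.1f} MW electric")
```

So a single well pair at these assumed conditions yields a few megawatts of electricity, which is why scaling to gigawatts means drilling many wells.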
Traditional geothermal energy relies on three components: heat, fluid, and a reservoir. You benefit from these natural geothermal reservoirs, while next-generation geothermal works by creating artificial reservoirs. By drilling and fracking to create fractures in the crust, you’re able to cycle fluid through it. Rest assured, this process has been tested and proven safe; it’s not something that’s going to blow up the Earth.
Ben: To highlight some exciting statistics: the amount of heat energy stored in the Earth’s crust accessible through enhanced geothermal energy far exceeds all known fossil fuels combined. This represents an abundant and low-carbon energy source without the intermittency problems associated with solar and wind. Additionally, we can leverage many of the tools we’ve developed for large-scale fracking.
Currently, however, we face a scaling challenge. How do we expand from tens of megawatts currently operating in the U.S. to hundreds of megawatts? Fortunately, the areas with the highest heat potential in the Earth’s crust largely overlap with federal lands that can be easily leased, creating a favorable situation for solving this problem.
Tim: Just to add on, for every 10 megawatts of energy produced using a method known as a triplet, which involves drilling three wells, we would need to drill 500 triplets to achieve our target of 5 gigawatts by 2030. That essentially translates to drilling 1,500 wells. We’ve done that multiple times in the shale sector within a single year, so it’s really not an insurmountable task if we manage it correctly.
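Tim’s scaling arithmetic can be checked directly; the only inputs are the figures he cites (roughly 10 MW per three-well triplet, and a 5 GW target):

```python
# Figures from the discussion: ~10 MW per "triplet" of three wells,
# and a target of 5 GW by 2030.
mw_per_triplet = 10
wells_per_triplet = 3
target_gw = 5

triplets_needed = target_gw * 1000 // mw_per_triplet
wells_needed = triplets_needed * wells_per_triplet
print(triplets_needed, wells_needed)  # prints: 500 1500
```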
Ben: I’d like to underscore that geothermal energy is the single energy source I’m most excited about, especially in terms of how it’s underrated by the general public. I also believe AI can serve as a significant demand source that could draw capital into the geothermal industry, driving development and the technical advancements necessary for the U.S. to become a global leader in this technology and enhance our energy leadership overall.
Tim and Arnab did a commendable job showcasing the many reasons this technology holds promise. The executive order has already initiated steps that have the potential to make a significant impact on this field, underscoring how critical policy decisions will be for the timeline of bringing geothermal technology online.
As Tim explained, the regions where traditional geothermal resources exist are primarily in the western United States, which aligns closely with areas that have substantial Bureau of Land Management lands. One of the challenges we’ve faced in developing geothermal projects on federal land has been the lengthy federal environmental permitting reviews.
The executive order has instructed the Department of the Interior to streamline these reviews. This includes eliminating redundancies in review processes for geothermal projects and creating designated “priority geothermal zones.” Here, the Department will focus its efforts to expeditiously advance permitting.
Ideally, these priority zones should coincide with locations where AI data centers are being established, ensuring that we’re moving in a coordinated direction. While there’s still much work to be done, we have a solid foundation to rapidly accelerate development in the geothermal sector.
Jordan: What about environmental considerations? Are there even endangered species at depths of 20,000 feet? Doesn’t it seem like just a handful of workers are at risk here?
Ben: It’s a great question. The environmental impacts here are indeed much less severe than those associated with traditional fracking in the oil and gas industry. While any construction project requires building infrastructure, which can affect the surrounding natural environment, the overall impact of deep drilling is lower. However, if endangered species are present in an area designated for construction, that becomes a factor in environmental assessments.
Ben: There are effects stemming from deep drilling that could influence surrounding regions, but the environmental challenges are significantly less pronounced. This is why, generally speaking, the permitting process can progress more quickly. It’s all about allocating the right policy resources to move as swiftly as we can, considering the relative environmental risks posed by this technology.
Arnab: Lessons from the shale revolution could provide valuable insights for incentivizing the development and production of geothermal energy. During the 1970s, after the Arab oil crisis, there was a focused effort to support nonconventional energy production. By the mid-2010s, the U.S. had become a leading producer of oil and natural gas.
There were four key policy interventions that played a vital role. First, the government funded research and development programs to innovate drilling techniques and improve production methods. The Department of Energy collaborated directly with companies like Mitchell Energy to share the costs associated with drilling and testing new production methods.
Secondly, there were various supply-side production tax incentives as well as demand-side price supports. The Section 29 tax credit was an example of an incentive for nonconventional production sources. Nowadays, the Inflation Reduction Act provides similar support for new energy types. These tax credits need to be maintained.
The third aspect involved changes in permitting practices. The Energy Policy Act of 2005 introduced legislative categorical exclusions enabling certain types of production to go through reduced NEPA analysis, accelerating approvals.
Lastly, we often overlook the broader macroeconomic environment, which was highly favorable through low-interest rates during the shale boom, allowing easy access to funding for new projects.
These interventions significantly contributed to the shale revolution. We need to explore how to shorten timelines for next-gen geothermal projects to achieve similar success.
Jordan: How far are we from reaching the point where these steam boiler-like technologies can be fully operational?
Arnab: Fervo is currently constructing a 400-megawatt facility. I’m unsure of its exact status, since it’s a private company and I’m relying on public information, but it has demonstrated the feasibility of its technology at a smaller scale. Tim mentioned a Google facility, which powers a data center with around 40 megawatts of energy. It’s really about scaling up.
The main challenge for these companies lies in securing the capital needed to validate that their technology can deliver energy at a utility scale.
Jordan: Regarding financing, where is the capital for these developments currently coming from, and where should it ideally be generated to realize the geothermal vision over the next five years?
Tim: The challenge lies in financing next-generation technologies, as there is a significant level of uncertainty associated with their development. Investors must be comfortable with potential cost overruns stemming from delayed permits or supply chain issues. Higher interest rates further complicate investment decisions.
This accumulated uncertainty deters equity investors since only a handful of venture capital firms focus on this risky sector, and they quickly reach their capacity. Banks are similarly hesitant due to the high likelihood of failure associated with uncertainty.
The government does play a role in financing this development. For instance, the Office of Clean Energy Demonstrations has funded small modular reactor projects. We need a similar government investment model for geothermal developments where the federal government can share costs, as it did with Mitchell Energy in the shale sector.
Additionally, the government must reduce uncertainty. When discussing technological demonstrations, it’s critical to gain investor confidence at each development stage to move toward larger-scale projects. Financing typically hinges on three components: debt, equity, and offtake agreements where someone agrees to purchase the energy produced.
Currently, many AI companies are investing in energy through power purchase agreements, but this sector faces substantial uncertainty. Key uncertainties stem from regulatory timelines and material bottlenecks, which are difficult to quantify.
It’s essential for one of these stakeholders to take ownership of that uncertainty, and it’s unlikely to be the offtakers: the premiums they would have to pay to make debt and equity investors feel secure that their investments won’t fail would be too high.
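One way to see why owning that uncertainty matters: higher perceived risk raises the return both lenders and equity investors demand, which flows straight into the project’s annual capital cost. The capital structure and rates below are purely illustrative assumptions, not figures from the discussion:

```python
# Toy model of how uncertainty raises a project's cost of capital.
# Illustrative assumptions: a $1B plant financed 60% debt / 40% equity.
capex = 1_000_000_000
debt_share, equity_share = 0.6, 0.4

def annual_capital_cost(debt_rate: float, equity_rate: float) -> float:
    """Blended annual return the project must earn on its capital."""
    return capex * (debt_share * debt_rate + equity_share * equity_rate)

low = annual_capital_cost(debt_rate=0.06, equity_rate=0.12)   # low uncertainty
high = annual_capital_cost(debt_rate=0.09, equity_rate=0.20)  # high uncertainty
print(f"${low / 1e6:.0f}M vs ${high / 1e6:.0f}M per year")
```

Under these assumed rates, the same plant costs roughly $50M more per year to finance in the high-uncertainty case; that gap is the premium federal backstops or streamlined permitting could remove.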
Jordan: That’s interesting! You’ve introduced some great acronyms: FOAK, SOAK, TOAK, and NOAK, for first-, second-, third-, and nth-of-a-kind projects. We’re clearly still in the early stages.
Jordan: Let’s take a moment to push back against the wave of hype. Looking ahead three years, we may still be dealing with relatively small demonstration projects. While I recognize the importance of perfecting the methods for drilling and system operation, I wonder if this will actually progress sufficiently forward, especially given the rapid timeline for AGI development.
Tim: I think this highlights the validity of a comprehensive energy approach. We don’t want to stake everything on geothermal. Staging different technologies as they become viable would be prudent. Initially, we could deploy natural gas plants, which can quickly come online to provide reliable energy.
We can also invest in promising solar and battery storage projects while developing geothermal. Small modular reactors will follow, and fusion could come into play in 20 years. The key is to diversify and invest in multiple technologies simultaneously to mitigate technological risk in next-generation programs.
In the geothermal field, we could implement a layered approach, relying more heavily on certain geothermal technologies while gradually transitioning to others. Traditional geothermal technologies are generally well-established and less experimental.
While they may not contribute vast amounts of gigawatts immediately, they can still play an essential role by 2028, especially if we encourage resource exploration as data centers are being developed. Enhanced geothermal methods, which are still in their early testing phases, could realistically come online beyond 2028.
Tim: Each energy source has its strengths and limitations, and it’s unrealistic to expect any one solution to fulfill AI’s entire energy needs. Combining various approaches, particularly solar and batteries, can provide robust energy supply options. While scaling presents challenges, solar plants are quicker to build, and efforts are underway to assess environmental impacts for solar development in certain western management areas, which could further expedite projects.
Jordan: Can we explore the topic of transmission lines and transformers? You mentioned earlier that some developments may end up being off-grid, with companies like Google building their power sources near new data centers. How significant is it to have transmission lines in the mix?
Ben: Transmission is indeed a vital part of the overall solution. In theory, it would be ideal to have all power resources co-located, eliminating the need for transport. However, it’s highly unlikely that we’ll be able to find sites for gigawatt-scale data centers that don’t require transmission lines.
At a minimum, transmission lines offer numerous benefits by expanding access to diverse energy resources. If we position data centers near various energy sources, we can capitalize on them even more. Moreover, interconnecting power generated onsite to the larger electric grid enhances stability, simplifies logistics, and reduces financial risks—enabling companies to resell excess power.
Ben: Furthermore, I want to highlight specific elements from the executive order designed to bolster transmission infrastructure for AI data centers. The Department of Energy possesses significant authority to tackle these challenges. One known power is establishing Natural Interest Electric Transmission Corridors, allowing the DOE to expedite permits that are languishing and hindering development.
While establishing a NIETC can take time, the DOE also has less-publicized authorities that should be explored. They can partner with developers to create and enhance transmission lines. Certain statutes empower the DOE to establish public-private partnerships aimed at constructing transmission infrastructure.
Analysis from the DOE suggests that these authorities might help bypass lengthy state approval processes, enhancing cost allocation efficiency and expediting projects typically bogged down by state utility commissions. Utilizing these authorities can significantly speed up the development of shorter transmission pathways connecting data centers to the grid.
Arnab: I completely agree with everything Ben just said. Transmission is critical, and I believe the short-term solutions he highlighted offer promise. The shift toward off-grid solutions often arises because of the difficulties associated with existing transmission infrastructure, indicating the need for long-term reform and permitting changes from Congress. Many of the necessary adjustments can’t be effectively addressed solely through the executive branch.

Ben: Some stats on transmission are shocking. It takes an average of 10 years to build a new transmission line in the U.S., largely due to permitting holdups. Ten years ago, we built 4,000 new miles of transmission lines annually. Now it’s about 500 miles, an eightfold decline.
Tim: This is a hard problem. Ben, what’s a transformer? Why does it matter?
Ben: Conceptually, a transformer steps the voltage of an electric current up or down. This transformation is essential to deliver electricity from transmission lines or other high-voltage environments down to voltages that an end-user facility can accept. We need transformers throughout electric infrastructure to bring power to the uses we ultimately want to serve.
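To make Ben’s point concrete, here is a minimal sketch of the ideal-transformer relationship, where output voltage scales with the ratio of secondary to primary windings. The function name and the specific voltages are illustrative, not from the conversation.

```python
def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Ideal transformer: output voltage scales with the turns ratio."""
    return (v_primary * n_secondary) / n_primary

# Step a 138 kV transmission feed down toward distribution level
# (hypothetical turns ratio of 10:1).
v_out = secondary_voltage(138_000, n_primary=1000, n_secondary=100)
print(v_out)  # 13800.0
```

The same relationship run in reverse (more secondary turns than primary) is how substations step generation voltage up for long-distance transmission.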
Tim: The challenge with the transformer industry, though, is that there’s limited capacity. With the resources currently allocated to transformer manufacturing, the industry cannot supply enough transformers to meet the demand from all the infrastructure AI requires.
Ben: I think that there’s a lot that can be done to support the transformer industry—like loan guarantees or other financing options that can be encouraged by the government. This support would allow the transformer industry to invest in the capital expenditures needed to expand their facilities or to train a new workforce and add new workers to existing facilities.
Tim: All these things are going to be important for bringing down transformer lead times, which, I believe, are currently in the two and a half to three year range. This is certainly an important supply chain aspect of the problem.
Jordan: Since we were talking about the Defense Production Act earlier, now that corruption is in and FARA is dead, how far can a president go who really just wants to let their main consigliere—the one building giant AI data centers—get all the gas turbines before everyone else? Is there any recourse for something crazy like that?
Tim: Look, broadly with DPA use—or any fairly assertive use of legal authority—it’s good to try to get political buy-in, too. Even if you believe that the legal authority is bulletproof and you can do what you want, you still want political consensus.
Ben: I think, at least, the fact that I’m here and talking about how the Biden administration prioritized this because they perceived it as a threat is significant. The Trump administration also recognizes this as a threat and suggests that there should be a national security aspect to the AI data center build-out. There seems to be some consensus.
Tim: One thing I would say, though, is that the DPA is up for reauthorization, which typically happens in a bipartisan fashion. If you want to take advantage of that consensus, you can appropriate specific dollars and authorities for the DPA to assist with our energy infrastructure build-out for AI data centers, and you could put safeguards around it.
Ben: If Democrats are truly concerned about the hypothetical you raised, it can serve as a negotiating chip for a bipartisan reauthorization.
Tim: Also, there were some things that happened in the Biden administration regarding the use of DPA for heat pumps that Republicans didn’t favor. There were hearings last year on this. It emphasizes the importance of building some kind of legislative consensus here.
Ben: Arnab, you’ve made great points. One aspect that shouldn’t be overlooked is the role of litigation as a check in this process. This dynamic has long occurred with infrastructure projects across various industries.
Tim: Exactly. The way litigation works around the National Environmental Policy Act and other permitting-related statutes can lead to injunctions that ultimately delay projects while court proceedings are ongoing.
Ben: This highlights the significance of ensuring that all the T’s are crossed and I’s are dotted when pursuing options that utilize national security authorities. If you step outside the bounds of the law, the odds of litigation increase.
Tim: That can ultimately result in delays for projects. If you can’t get past the pre-construction stage due to extensive litigation, it can detrimentally impact your ability to advance artificial intelligence. Time is really of the essence in this space.
Ben: So, we must ensure compliance with laws for a variety of reasons. I think a lot of the dynamics you just pointed out also apply to all the NEPA matters discussed earlier.
Tim: Absolutely. The Biden administration was making efforts to ease the process for firms; Trump then reversed many of those changes. Whether that’s legal, we’ll find out. On one hand, it might be exciting for companies like Google and Amazon, but on the other, it opens them up to a new legal attack surface.
Ben: Any additional thoughts on that?
Tim: I just want to quickly add that your point about uncertainty is spot on. Trump’s rollback of the NEPA regulations could speed things along, but it doesn’t eliminate the underlying statutory requirement for agencies to review the environmental consequences of their actions.
Ben: Right, and without proper regulation, there’s going to be a lot of ambiguity and uncertainty about what that means. Courts may look to years of past practice to determine what exactly the statutory requirements entail.
Tim: I want to point out that the government-wide NEPA regulations issued by the Council on Environmental Quality (CEQ) were rescinded. However, every agency still retains its own rules for conducting a NEPA analysis, and those remain in effect.
Ben: So while there is long-term uncertainty, there is still a regulatory framework existing for agencies to follow, which is crucial for compliance.
Jordan: Alright, Tim. We’ve discussed all the support we’re giving these hyperscalers, from funding their energy sources to easing environmental controls. But I think you wanted to introduce a hook to get them to take AI security more seriously. What’s the market failure here? What sorts of requirements do you think the government should impose in exchange for these special dispensations?
Tim: Great question! One core claim we advance in our report—and that is represented in the previous administration’s executive order—is the market failure surrounding AI security.
Ben: So what does that mean?
Tim: As many guests on your podcast have mentioned, U.S. companies and AI firms are building models that may reshape global economic and military power. Think of AI systems capable of massive cyber attacks or automating scientific R&D.
Ben: If that’s the case, we need to protect these systems from theft.
Tim: Exactly! Protecting against sophisticated threats, like nation-state hacking groups, is challenging and costly. If a company invests adequately in security, they risk falling behind competitors who aren’t investing as much.
Ben: So, how do you tie this assistance around loans and permits to security requirements?
Tim: The idea is to create a set of security requirements for labs that hyperscalers and AI firms must follow. This raises the security level for protecting critical IP against threats. Tying assistance to these requirements transforms it from a disadvantage into a sound commercial decision.
Ben: That’s interesting!
Tim: We have outlined several ideas for what this could look like, such as applying existing standards across the board or creating new standards specific to the threat model for attacks on AI model weights.
Ben: This could involve substantial government assistance for supply chain security, physical security for AI accelerators, personnel background screenings, and counterintelligence playbooks.
Tim: Precisely! The aim is to form a strategic partnership between the government and the AI industry to enhance security while providing incentives for compliance.
Ben: The potential future you described, where the U.S. excels and accelerates faster than others, raises questions about the security challenges these AI systems could face.
Tim: Right! There’s a scenario where if we dominate AI technology, the incentives for adversaries may become more pronounced.
Ben: What if we encounter a physical attack that fries servers?
Tim: The physical security aspect of the vast investments in data centers is a critical focus area for both the U.S. government and companies.
Ben: I share your stance on protecting model weights. However, I am cautiously optimistic about their future. In the coming years, I believe powerful models will initially be deployed internally for several reasons.
Tim: Absolutely! As capabilities grow, labs will want to address misuse concerns before widespread deployment.
Ben: It makes technical and economic sense to deploy internally first. And it’s beneficial to train large models before distilling them down for broader use.
Tim: Yes, this practice is now common in many Frontier Labs. Protecting cutting-edge models can occur within a more favorable security environment before they are released for general use.
Ben: True. However, I think we should still consider the risk of models being stolen eventually. Protecting the absolute cutting edge is essential as it helps maintain our lead.
Tim: Absolutely! It’s crucial to prevent the free transfer of powerful models, as allowing this could bolster another country’s AI research.
Ben: Plus, we must prevent denial or sabotage operations. I’ve seen research highlighting how current AI data centers could be vulnerable to attacks.
Tim: I agree! Investing in defenses against such potential threats is essential.
Ben: Ultimately, protecting our AI talent pipeline is also pivotal. If we impose strict security checks on every AI PhD graduate, we’ll struggle to maintain a robust ecosystem.
Tim: Financing energy projects typically requires three components: debt, equity, and offtake agreements—meaning someone has to agree to purchase the electricity once it’s produced.
Ben: Right. Currently, many headlines claim AI companies are investing in energy projects, but mostly through power purchase agreements.
Tim: That’s correct! There’s immense uncertainty in next-generation energy due to factors like permitting timelines, regulatory processes, and material feasibility.
Ben: It’s vital for one of those three participants to own that uncertainty.
Tim: Exactly! You’d need a very high premium on offtake agreements to cover the uncertainties that would reassure debt or equity investors.
Ben: It sounds like you’re proposing that the federal government can either own that uncertainty by providing upfront capital or work to streamline regulatory procedures.
Tim: Yes! This could unlock significant capital, encouraging companies to invest directly in projects.
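The debt–equity–offtake structure Tim describes can be sketched as a toy break-even calculation: the offtake (power purchase) price must cover annual debt service and equity returns, and uncertainty shows up as a risk premium layered on top. Every figure below is hypothetical and purely illustrative; a real project model would amortize capital over the project life rather than charge simple annual costs.

```python
def breakeven_ppa_price(capex, debt_share, debt_rate, equity_return,
                        annual_mwh, opex_per_mwh, risk_premium=0.0):
    """Rough $/MWh price an offtaker must pay so debt and equity are covered.

    Treats annual capital charges as simple interest/return on the stack;
    risk_premium models the markup needed to absorb permitting and
    construction uncertainty.
    """
    debt_cost = capex * debt_share * debt_rate          # annual debt service
    equity_cost = capex * (1 - debt_share) * equity_return  # annual equity return
    price = (debt_cost + equity_cost) / annual_mwh + opex_per_mwh
    return price * (1 + risk_premium)

# A hypothetical 500 MW plant at ~60% capacity factor and $2B capex.
annual_mwh = 500 * 8760 * 0.6
base = breakeven_ppa_price(2e9, 0.7, 0.06, 0.12, annual_mwh, 10.0)
risky = breakeven_ppa_price(2e9, 0.7, 0.06, 0.12, annual_mwh, 10.0,
                            risk_premium=0.25)
print(round(base, 1), round(risky, 1))  # 69.4 86.7
```

The gap between the two prices is the premium an offtaker must absorb when neither lenders nor equity will own the uncertainty, which is exactly the gap government capital or streamlined permitting could close.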
Ben: Seems like some tech companies, despite having substantial cash reserves, aren’t engaging in upfront energy investments.
Tim: True! Although Amazon has made a direct investment in an SMR project in the Pacific Northwest, we aim to see more of this.
Ben: I have a question; there was a humorous chart you and Arnab had, showing Google, Amazon, and Microsoft committing to net zero by 2030. Their trend line goes the wrong way once they realize they need to invest billions in data centers.
Tim: That’s a great observation! Do those commitments stand firm in our current climate?
Ben: It’s essential for policymakers to figure out how to ensure emissions-free technologies become cost-competitive and quick to deploy.
Tim: Right! Our job is to make decarbonization happen as fast as possible.
Ben: I agree that making energy sources affordable is one key to ensure adoption. But it’s also crucial that the permitting timelines are efficient.
Tim: Exactly! If clean energy technologies can come online more quickly than less clean sources, it incentivizes industries to choose the faster permitting route.
Ben: There’s been considerable work in the last administration, aligning actions with AI’s energy needs.
Tim: Absolutely! Effective implementation and prioritization within federal agencies are linchpins to success.
Ben: Ensuring timely and effective progress is extremely important. In your paper, you recommended appointing an AI infrastructure czar to spearhead this complex effort, blending national security, energy policy, and permitting law.
Tim: Strong leadership from the White House and federal government is necessary to make the envisioned goals a reality.

Ben: So, I think that’s one simple but important point to underscore: the implementation side of this really matters.
Arnab: Absolutely, Ben. Without effective implementation, even the best policies can fall short.
Tim: Right, it’s all about ensuring that the strategies we develop translate into real-world impacts.
Jordan: Thank you all for being a part of this discussion. It’s crucial we continue having these conversations as we navigate this complex landscape.