Crusoe Energy secures $15B financing for 1.2 GW Abilene data center campus in partnership with Blue Owl

May 29, 2025 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Chase Lochmiller

CEOs and research scientists. Lab-on-lab violence. Lab-on-lab violence. Anyway, speaking of the man who powers it all, we have Chase Lochmiller from Crusoe Energy in the studio. How you doing? Welcome. Welcome to the stream. We are missing sound. Are you muted? Are we muted? What's going on?

Let's check it out and get to the bottom of this. And he builds. We can hear you now. Makes data centers. He does. He's not a Zoom expert. Give him a break. What is new with you? How are things going? Give us the latest on all the news that came out.

I feel like the last two weeks have just been Crusoe wall-to-wall coverage. But how are things going in your world? Things are good. Things are busy. Turns out AI needs a lot of power, and all these chips need a lot of data center capacity.

So what we announced last week was the completion of funding for the expansion of our facility in Abilene, Texas. That's going to consume a total of 1.2 gigawatts of total power capacity. And that funding we did in partnership with Blue Owl Capital.

So the total funding is about $15 billion in total capacity to build out. That's amazing. We love to see it. How is it different working with an investor, a financial institution like Blue Owl, versus some of the investors that you've worked with on the venture side?

Is it more Excel and less vibes and decks? Is it a wildly different underwriting scenario? Like, you know, we're more familiar with the venture style, where it's usually just a handshake, a term sheet on the back of a napkin, a prayer. Yeah.

I feel like I left the vibes investors behind a long time ago. That was more the earlier-stage stuff. So, you know, certainly different.

And I think with Blue Owl, they've been a great partner, one of the leading real estate private equity practices in the world. So very sophisticated. I mean, they've been super supportive and helpful across getting the entire deal structured and financed.

And then obviously very deep-pocketed capital providers to really help us make these projects happen at really significant scale. Now, what's different is that this is not an investment in Crusoe equity, right?

This is a partnership with Blue Owl for this specific project. Got it. And Crusoe has a business model that is not asset-light; it's very capex-intensive. Yeah.

Which requires being able to tap into those very large pools of capital to basically make these large-scale AI factories happen. So really break down the anatomy of how these deals work.

Is it like there's a new LLC or a new C corp that's created? And then is there something that looks like a mortgage, with a 30-year payback period and an interest-only period? And I imagine that this facility is going to make no money for a few years while you're building it out, but then it'll start making a bunch of money, so is there some sort of repayment schedule that's responsive to that dynamic?

Yeah.

So, the breakdown of the structuring. This also came out: there is construction financing that's being provided by JPMorgan. On the expansion it's a little over $7 billion from JPMorgan, and then a number of other banks and capital providers in the syndicate, including Bank of America and a handful of others.

But the way it works is, it is a propco, right? So it's a property company, basically an LLC, that ends up owning the campus, or the individual buildings.

All of those buildings have an affiliated lease with them, with our customer, which is a long-term lease agreement. Being able to partner with a large-scale, investment-grade customer is really what helps unlock a lot of the capital here.

When people are talking about these very large quanta of capital, credit is the magic unlock, right? When you have big, long-term offtake agreements with high-credit-quality customers, that's where you can really unlock these larger pools of capital, and we can put $15 billion to work in a productive capacity. So that's happening over there. How do you make money, then? Sure. We make money both as a project developer and as a partner with Blue Owl in the ownership of the entity.

Makes sense. It's basically like we're the landlord, right? We have a customer that is paying us a monthly rent for 15 years, and we make money that's in excess of, you know, sort of our debt service obligation. Yeah. Then talk to me about the energy side.
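The landlord economics he's describing, rent collected over a long lease in excess of debt service, can be sketched with a toy cash-flow calculation. All figures below are hypothetical assumptions for illustration, not actual terms of the Crusoe/Blue Owl deal:

```python
# Toy propco "landlord" cash flow: rent collected minus level debt service.
# All numbers are hypothetical assumptions, not actual deal terms.

def annual_equity_cash_flow(annual_rent: float, debt: float,
                            rate: float, amort_years: int) -> float:
    """Annual rent minus a level (annuity-style) debt-service payment."""
    debt_service = debt * rate / (1 - (1 + rate) ** -amort_years)
    return annual_rent - debt_service

# Hypothetical: $7B of construction debt at 6%, amortized over a 15-year
# lease term, against $1.2B/year of lease revenue.
cf = annual_equity_cash_flow(annual_rent=1.2e9, debt=7.0e9,
                             rate=0.06, amort_years=15)
print(f"Annual cash flow to equity: ${cf / 1e9:.2f}B")
```

The point is only the shape of the deal: as long as the investment-grade tenant's rent exceeds the debt service, the propco's equity holders earn the spread.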

That's kind of the bread and butter, the history of Crusoe as I understand it. How important is it to find unique combinations of resources to provide energy to these large-scale data center projects? What's happening in Abilene? What's unique about Abilene?

And then I want to dive a little bit more into the life cycle of that energy production plan. Yeah.

So much of the bottleneck of scaling AI has boiled down to lack of energy, or lack of access to energy. Crusoe, from its founding seven years ago, has always taken this energy-first approach to building computing infrastructure. Instead of thinking about how do I build the next data center in Northern Virginia, we've always thought about where we can access low-cost, clean (as much as we can), and abundant energy to power computing infrastructure. And the revolution you see unfolding with AI, this complete transformation of the digital infrastructure landscape, is pretty mind-boggling when you really think about it, because Northern Virginia is sort of the center of the world for data centers.

Everybody's like, okay, the Northern Virginia corridor, that's where the internet is happening; that's probably where this Zoom conference is being hosted. Yeah. So much of the internet happens in Northern Virginia. AWS US East. We know and love it. Yeah. Exactly.

The backbone of our industry. Yeah. Totally. Exactly. So, all of the data center capacity we've ever built in Northern Virginia is about four and a half gigawatts. Wow. What we're doing in Abilene, Texas is 1.2 gigawatts. Wow. And we're looking at trying to do more. Yeah. We're one company.

This is for one customer. We're looking at other sites that are 5 gigawatts, right? So you're talking about building a whole Northern Virginia, which was built over the last three decades, as one facility for one customer, right?

There's just fundamentally not enough power in Northern Virginia to make that happen. So what's happening with AI is you're seeing everybody start to take an energy-first approach to developing this infrastructure. And that's really what led us to Abilene, right?

Abilene is a market where there's an abundance of energy. A lot of wind particularly, and solar, had been built on the back of production tax credit incentives. And their problem was actually that they didn't have enough demand for energy, right?

They would frequently get curtailed, meaning they're shutting down their wind farms, or pricing would go negative and they would sell at a negative price. So their issue was actually just not enough demand for power.

So it was a good natural fit between AI factories and low-cost, clean, abundant energy. It's a pretty awesome setup. We're going to account for, I think the number is, about 30% of the total tax revenue for Abilene, just from our project. That's amazing.

Can you give me kind of an energy 101 on energy production in Northern Virginia? I'm sure natural gas is in there; there's solar, there's wind. I don't know if there's any nuclear. Are we still using coal at all? I really have no idea. Yeah. Yeah.

What about crude oil or fuel oil, or just gasoline? Does that power AWS at all? I'm just curious about that mix. Yeah. There is, actually, especially during moments of peak demand. If you have peak demand at night, an extremely cold night, there's obviously no solar, so: diesel generators. Sure.

And people are oftentimes just having to burn oil to produce power. Which is not a good option; it's just the dirtiest possible option here. We could be a lot cleaner. It's expensive. And it's expensive. Yeah.

And then contrast that with Abilene. What's the energy mix look like there? Is it similar, or is there something different? You mentioned wind. Is there more wind in Abilene? Yeah, I mean, Abilene is one of the windiest places in the United States.

It's this corridor that just gets a tremendous amount of wind, and that's why a lot of renewable energy developers built there. Sure. I think it's actually important to understand this production tax credit.

So the way this works is that the independent power producers that build these renewable energy facilities get paid a production tax credit for producing and selling a kilowatt-hour of clean power. Mhm. Now, they get paid that regardless of who they sell it to and at what price.

And so that has led to these consequences where you'll often see power prices go negative, because the producer's actual realized price, after you factor in the production tax credit subsidy they're getting, is still positive. They're having to pay someone to take that kilowatt-hour, and then they go collect the subsidy through the production tax credit.

Mhm.

So now the issue becomes: those production tax credits only exist for 10 years. At the end of the 10 years, you still have this working wind farm, and you're like, okay, power prices go negative, now I have to curtail, I have to shut off. Which means I could be producing power, but I'm not, because there's literally no marginal demand for the power. And I think this is where the alignment comes in: markets where you can produce power in a very cost-effective capacity, and where you can actually build an AI data center to soak up that energy. That's a very good alignment that Crusoe has tried to facilitate.
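The incentive he's describing comes down to a sign check. A minimal sketch of that logic; the $27.50/MWh credit is an assumption, roughly the inflation-adjusted federal wind PTC, and the market prices are made up for illustration:

```python
# Sign logic of the wind production tax credit (PTC) described above.
# PTC_PER_MWH is an assumption (~inflation-adjusted federal PTC);
# market prices are hypothetical.

PTC_PER_MWH = 27.5

def realized_price(market_price: float, ptc_active: bool) -> float:
    """What the producer actually nets per MWh sold."""
    return market_price + (PTC_PER_MWH if ptc_active else 0.0)

def should_run(market_price: float, ptc_active: bool) -> bool:
    """Wind has ~zero marginal cost, so run whenever the realized price > 0."""
    return realized_price(market_price, ptc_active) > 0

# Inside the 10-year PTC window, selling at -$20/MWh still nets +$7.50/MWh:
print(should_run(-20.0, ptc_active=True))   # True: keep producing
# After year 10 the credit expires, and the same price forces curtailment:
print(should_run(-20.0, ptc_active=False))  # False: shut the farm off
```

That expiry is exactly why post-PTC wind farms want a large, local, always-on buyer like an AI data center.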

Can you talk a little bit about the history of the company? I know that at one point there was some crypto mining with gas flaring going on, and then the transition from that into the current AI boom.

Was that a major emotional roller coaster, or did it overlap in a perfect way where it was just all growth? Um, yeah, totally.

So when we started the company, one of the first applications of energy was, as you talked about, basically capturing waste methane from oil production that would otherwise be flared. Okay.

And then utilizing that to power, initially, Bitcoin mining data centers, but we also powered early versions of our AI data centers. Sure. Like smaller training runs, right? Yep. And this was pre-ChatGPT.

This was, you know, we were working with the MIT Department of Physics, doing early simulations of the Big Bang, and the CCL department at MIT. Early work in our AI cloud development. But ultimately, when I started the company, I really wanted to build an AI cloud platform. So a lot of people are like, wow, this was a great pivot.

And I always tell them, no, this was the plan from day one, believe it or not. Sure. I always felt like energy was the thing that tied together all computing infrastructure, and for any computing application, when really scaled out, energy does become the bottleneck.

I had seen and experienced that in these proof-of-work blockchains like Bitcoin and Ethereum. And I felt like if AI was going to scale, energy would be a massive component in the overall operating cost of operating intelligence systems at scale.

So we did a lot of early investment in terms of making the platform work with Crusoe Cloud, and trying to figure out what we wanted to build, and for whom.

And then with the launch of ChatGPT, it really catalyzed massive investment and attention toward purpose-built GPU infrastructure, and we had done that from the ground up, all the way from energy to data centers, as well as managed infrastructure-as-a-service at the software layer.

Did you lose any sleep over the DeepSeek news, or were you Jevons-paradox-pilled from day one and knew that it was just going to keep going? Um, yeah. I think just the way that got spun up in the media was so mysterious, a very pro-China media spin very early on.

I think everybody was like, wait, there's no chance this was a couple of hobbyists that had a couple of GPUs in their garage and trained this model; that's just not what happened. They did this training run with scraps. Scraps, and powered with a bicycle. Yeah.

Yeah, totally. A couple of guys with pens and paper. But I guess the bigger question is: it does feel like we're somewhat shifting from a pre-training to an RL environment.

The scaling laws are holding in the macro, but it seems like there's a series of S-curves in terms of the different training and improvement paradigms that lead to better products. And so yes, we can do another 10x increase in pre-training, but maybe we hit a data wall, or there are some problems there.

What are you seeing in terms of tradeoffs for demand on the data center side as we go through these paradigm shifts, in terms of what's important to create a really performant AI product? Is it just that we're shifting from training to inference, and that doesn't even affect you? Does it affect you?

Do you need a different buildout for a large training run versus mass inference of complex models, consistently, forever, all the time, because demand's so high, but it's smaller models all over the place? Does any of that affect the way we build data centers?

I think it does affect some of the ways that we build data centers.

But to the question of slowing demand: in the conversations that I'm involved in, if anything we're seeing demand accelerate, and for bigger, larger-scale clusters. And the overall demand is increasing quite a bit for inference as well.

And I think you see this transition where folks will use a very large AI factory for a training run that gets some state-of-the-art model. That infrastructure is still useful for a long period of time, to serve both inference workloads as well as these post-training, test-time-compute-scaling, chain-of-thought reasoning models, which are basically taking inference queries, thinking about them, playing out a whole bunch of different scenarios, and then coming up with better, smarter, more intelligent answers. And I think that's the crux of the infrastructure: what we're building are these AI factories, right? They're factories that manufacture intelligence, factories that manufacture intelligent outcomes that are prompted by input from users. And I don't see any near-term shortage of demand for more intelligence.

Are you bullish or bearish on upstart or SMB players that want to build AI factories or data centers? They see the broader opportunity, put together sometimes sophisticated teams, sometimes less sophisticated teams, but are able to pull together capital, want to bring data centers online, and just assume there's going to be demand waiting there, or assume that they can actually build something that's state of the art. Pretty bearish.

We've seen a massive influx of, we call them, "two guys in a pickup truck": you know, I've got my cousin Lenny, who has this plot of land out by his ranch, and there's a power line that goes through it, and he knows someone that works at the power company. Just put up a barn and throw some racks in there.

We're in business. Hit up a hyperscaler. Hey, I'm a Google shareholder. I'm going to just call up Sundar. I got a contract. No problem. Yeah, I got some 3090s in here. I got a couple of 1080 Tis right over there. A couple of propane tanks. Yeah, propane-barbecuing GPUs.

Yeah, I think the thing is that these projects are so big that the capital investment to make them happen is massive. You're seeing people speculatively build smaller-scale stuff, but for the really big stuff, whatever it is, a couple hundred megawatts or gigawatt-plus, you really need credit to make it happen. Like I said before, credit is the unlock to all of this infrastructure getting built, and we have companies with the greatest balance sheets in the history of business that are going all-in on this technological paradigm shift underway. With that, you can unlock a lot of infrastructure capital to make all of this happen.

But if you're going to speculatively spend 20 million bucks trying to build an AI factory, it's like shooting a BB gun at a grizzly bear. Well, yeah.

At the $20 million scale, you can get a group of smart people that can make a deck, and then investors are going to see that and be like, I want to make money on this AI thing; well, yeah, we'll throw in 20, 30, 40.

Chase has the best animal-based metaphors, because I remember we were at that nuclear conference and you said that the demand for energy is so high that companies would burn whale oil if they could. And for some reason you keep coming back to these, but the BB gun at the grizzly bear is great.

So I'm sure you work with hundreds of different vendors for different components, parts, et cetera.

Where do you think there is major supply chain risk or shortages? I was about to ask this. We heard that a large portion of the transformer supply chain (not the transformer algorithm, the physical infrastructure) comes from China, and maybe there's a risk to that supply chain with the trade war. Would love to know what the key inputs are. Outside of, we all know, power, we all know Nvidia GPUs, what else could we be constrained on?

Whether it's cement or transformers or copper, I don't even know. Lay it out for us. Yeah. I mean, the bottlenecks move around. The bottlenecks for AI infrastructure builds kind of move around. You had this moment of infinite demand for H100s when they first launched.

And I think Elon famously said that it was way easier to acquire illegal drugs than to get an H100.

And it's rapidly shifted into energy and data center capacity, and what it means to actually build that stuff out. High-voltage transformers are definitely a big bottleneck. A lot of that capacity does get built in China, but it's a diverse enough supply chain that I'm not that worried about a trade war impacting high-voltage transformers. They are long-lead-time assets, though. And outside of that, there's a whole stack on the transformer side, too.

So you have the medium-voltage transformers; switchgear can be a major long-lead-time item, which Crusoe actually started manufacturing in-house. So we have factories in Tulsa, Oklahoma and right outside of Denver, Colorado.

When you say switchgear, is that like network switches, like Ethernet routing, or something else? Oh, sorry. It's electrical switchgear.

This is basically your electrical room that has all of the breakers that feed into the actual data halls. Yeah, it's like a power strip.

You plug it in the wall, you get six outlets out of the back, kind of like that, but the big version of that. Is that right? It's kind of like your breaker box in your house. Like, you trip a breaker. Yep.

You've got to go down and flip the switch. It's like that at a gigawatt-scale data center. Yeah, that makes sense. But switchgear is definitely a bottleneck. Sure. Chillers are another big thing.

I think an interesting trend in data centers right now is, with the introduction of the GB200, the new Nvidia chip, you're basically seeing this massive transition to liquid-cooled computing at significant scale.

A lot of the government labs and high-performance computing communities have been experimenting with things like immersion cooling, single-phase and two-phase, as well as water cooling and DLC (direct liquid cooling), for decades. But no one's ever done it at the scale that's unfolding right now.

And the reason it's happening is that you just have so much energy density, so much heat being produced by these new NVIDIA chips as we move on to more advanced architectures, that there's simply not enough heat capacity to move that heat off the chip with a traditional aluminum heat sink, or one that isn't so big as to be impractical. Can you talk about the life cycle of water in some of these AI factories?

We had somebody on the show, I don't remember their name, but I do remember they said, "We don't have enough water in Abilene to run these data centers," and that didn't quite feel correct. So I'm not going to call them out. Is that a bottleneck? No, it's not. It depends on how you design it.

Water can be a very sensitive topic depending on the communities that you're engaged in, and Crusoe has always tried to be a phenomenal partner to the local communities that we're working with.

So the way we've designed our AI factories is what's called a closed-loop architecture. Yeah.

So that means you have cold water that flows into the rack and over the chips, through copper pipe, and you have this heat exchange from the silicon, through the copper, to the water, and then hot water is exhausted from the rack.
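The rack-level loop just described is governed by the standard heat-transport relation Q = ṁ·c_p·ΔT: the water's flow rate and temperature rise determine how much heat it carries away. A minimal sketch, with hypothetical rack power and temperature rise rather than Crusoe's actual design numbers:

```python
# Coolant flow needed to carry rack heat away: Q = m_dot * c_p * delta_T.
# Rack power (120 kW) and temperature rise (10 K) are illustrative assumptions.

C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def required_flow_kg_per_s(heat_watts: float, delta_t_k: float) -> float:
    """Mass flow of water needed to absorb heat_watts with a delta_t_k rise."""
    return heat_watts / (C_P_WATER * delta_t_k)

# Hypothetical 120 kW liquid-cooled rack with a 10 K inlet-to-outlet rise:
flow = required_flow_kg_per_s(120_000, 10)
print(f"{flow:.2f} kg/s, about {flow * 60:.0f} L/min")
```

Air has roughly a quarter of water's specific heat and a tiny fraction of its density, which is why these power densities push everyone to liquid.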

That hot water then goes out to a heat exchanger, a chiller that's outside. You can basically think of it as a massive maze of copper pipe that you blow air over.

You try to blow cold air over it, the heat gets exhausted out of the water, and that cold water then feeds right back into the system. Mhm. Yeah. I think people hear one million gallons of water per building. We only fill it one time, right?

It's not like we're using a million gallons. Yeah. And this is what the media has implied: that every time you make a cute Studio Ghibli image, you're dumping a gallon of water, you know?

It's like, yeah, a gallon of water might flow over the chip while you're doing that, but then it flows over the next one and the next one.

It's recirculating. There are systems, not at that kind of scale, that are open-loop, where you have a fresh water supply, and you are consuming water in that scenario.
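The closed- vs. open-loop distinction can be made concrete with back-of-envelope numbers. The one-million-gallon fill comes from the conversation; the open-loop evaporation rate is a hypothetical figure for comparison, not measured site data:

```python
# Closed-loop vs. open-loop water use over a facility's life.
# The evaporation figure is an illustrative assumption, not site data.

def closed_loop_gallons(one_time_fill: float, years: int) -> float:
    """Closed loop: fill once, recirculate; consumption doesn't grow with time."""
    return one_time_fill

def open_loop_gallons(evaporation_per_day: float, years: int) -> float:
    """Open (evaporative) loop: fresh water is consumed continuously."""
    return evaporation_per_day * 365 * years

# A 1M-gallon one-time fill vs. a hypothetical 100k gal/day evaporative plant:
print(f"Closed loop, 10 years: {closed_loop_gallons(1_000_000, 10):,.0f} gal")
print(f"Open loop, 10 years:   {open_loop_gallons(100_000, 10):,.0f} gal")
```

The asymmetry is the whole point: one design's water footprint is a constant, the other's grows linearly for as long as the plant runs.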

But in our case, we've designed it with a closed-loop architecture: you fill it one time and you're done. Three cheers for closed loops. I love closed loops. Last question I have: how do you evaluate your pipeline?

I'm sure you're a very popular guy, getting phone calls and emails from all over the world, but you're building physical infrastructure. It's not like you can just copy and paste what you're doing in Abilene, or some other areas, a million times. So I'm sure you have to be pretty selective.

Yeah, I mean, we're trying. We have a couple of other projects that are underway that hopefully we'll be able to talk about more soon, but similar scale or bigger is kind of what we're seeing.

So, a lot of demand unfolding within the ecosystem. And I think we try to be thoughtful about our partners. We really, ultimately, want the space to be successful.

I view AI as a generational opportunity to transform human prosperity around the world, and we just want to help make that happen. We don't think any one company is going to do it alone.

We want to help support the entire industry in terms of making this technology successful, scaled, and really rolled out to the masses. Makes total sense. Fantastic. I have a ton more questions, but we'll have to have you back on, because this was a fantastic conversation. We'll talk to you soon, Chase.

Thanks so much for stopping by. Cheers. Great to have you. Take care. Before our next guest, let me tell you about Ramp: ramp.com. Time is money. Save both. Easy-to-use corporate cards. I was almost in tears earlier, tears of joy, using Ramp Travel. It is the most amazing product.

I know this is such a shill. No, it's not a shill. It was off air. It's incredible. John was saying it is a beautiful thing that I can get IMO gold medalists to make me my business travel app. Yes.

I mean, when we talk about software that is lacking, the most obvious example is the airline app, right? The airline app is notoriously buggy, and you're never logged in. Ramp Travel saves all of your information immediately. It shows you all the flights across everything.

You just click one button, it books it, and boom, you're booked. No chasing receipts. No chasing receipts, because it all happens inside of Ramp, which is amazing.

But even aside from the expensing, even if I had to do something else on the expensing side, just the experience of actually booking on it is so much easier. I mean, I wish we had it on camera, because it was a really special moment. Leaked, leaked, leaked. I'll be right back.

Anyway, our next guest is coming into the studio. But first, let me also tell you about Figma. Think bigger, build faster. Figma helps design and development teams build great products together. Go to figma.com to get started. We have our next guest coming into the studio.

Factory AI's Matan, welcome to the stream. How are you? Thank you so much for having me. I am good. How are you? I'm great. Thanks so much for hopping on. Can you kick us off with a little