Tomasz Tunguz: AI data center buildout now exceeds 1% of US GDP — echoing dot-com infrastructure boom
Oct 22, 2025 · Full transcript · This transcript is auto-generated and may contain errors.
Featuring Tomasz Tunguz
later in the show. We're talking Chad in one word. Yes, Jeff. Jeff for sure. Also Fin.ai, the number one AI agent for customer service: number one in performance benchmarks, number one in competitive bake-offs, number one in ranking on G2. Our next guest is already in the Restream waiting room.
We're gonna bring him into the TBPN Ultra Dome. Tomasz, how are you doing? Good to see you again. It's been too long. Great to be back. Thanks so much for hopping back on. You've been lighting up the internet with a bunch of hot-ish takes. A lot of takes I agree with, but mostly just very thoroughly researched takes.
And so, not something you see a lot. Not something you see every day. So I'm very excited to have you on the show. Would love to kick it off by taking your temperature on where you think it's most important to focus right now. What is the narrative?
Is it these complicated structures and the vendor financing? Is it the overall bubble narrative broadly? Progress on token generation or DAUs? Where are you spending your time focusing? Yeah, we're trying to understand what's really happening, right?
So we wrote this blog post about Nvidia and the $110 billion of vendor financing. I ran the numbers for an event: the data center buildout is now greater than 1% of US GDP. And we had made this prediction in '23 that AI would contribute at least 1% to GDP.
And I was stunned that we're basically already there.
And so we had this question, which LPs will ask us and investors will ask us: how similar is the current environment to, say, 2000, when you had pretty large network infrastructure buildouts? I was in high school at the time, but I remember companies like Lucent and Nortel going to the moon and then collapsing. So we ran this analysis, and we're paying attention to a lot of different things.
On the bull side, we're paying attention to token generation, right? Google has now released three data points, and we can start to track that the growth rate there is slowing. Is the growth rate slowing because user demand is slowing?
I don't think that's the case; maybe there are pretty significant improvements in overall efficiency. The other thing we're paying a lot of attention to is just understanding the debt structures: the use of SPVs, on- and off-balance-sheet vendor financing.
And then the last is the depreciation schedule. A lot of these data center companies have extended their depreciation schedules, almost doubling them. Amazon went from three years to six years, and then backed off to five earlier in 2025.
And that's important because it's the collateral that's underwriting these loans. Yeah. Let's start with the vendor financing. Yeah. It is notable that everyone was so hopeful that AI would increase GDP growth, and it did, but it came at the cost of hundreds of billions of capex. Yeah. Yeah.
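To make the depreciation point concrete, here is a quick straight-line sketch; the fleet cost is a made-up, illustrative number, not a figure from the conversation:

```python
def book_value(cost: float, life_years: int, years_elapsed: int) -> float:
    """Remaining book value under straight-line depreciation."""
    annual_expense = cost / life_years
    return max(cost - annual_expense * years_elapsed, 0.0)

fleet = 12_000_000_000.0  # hypothetical $12B GPU fleet

# After three years, a 3-year schedule says the collateral is fully
# written off, while a 6-year schedule still carries half the value.
print(book_value(fleet, 3, 3))  # 0.0
print(book_value(fleet, 6, 3))  # 6000000000.0
```

Stretching the schedule halves the annual expense and keeps more collateral on the books at any given age, which is why the schedule change matters to the lenders underwriting against that hardware.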
Um, yeah, I want to start with the vendor financing because that's ringing a lot of alarm bells. A lot of people are going to the Lucent story.
I took it in a different direction, and I was wondering if you're familiar with the ASML customer co-investment program. In 2012, TSMC, Intel, and Samsung pitched in $6.8 billion across R&D funding and equity purchases in order to help ASML pull forward EUV lithography and 450-millimeter wafer technology.
And it was this odd round trip, but it worked out. So when somebody tells me vendor financing is immediately a red flag, I always want to say, well, it doesn't always end in disaster. So how much nuance should we be placing on the vendor financing stuff?
How much do we need to be digging in to actually understand what's at risk here? So vendor financing itself is really common, right? You pointed to this ASML example. It's a wonderful thing within the ecosystem.
And vendor financing fails when you have a closed system where they're all borrowing and lending from each other and there's no net new GDP coming into the system. Yep. Right. And then it just goes around and around, everybody takes their tax, and GDP goes to zero, which is kind of what happened.
And that's not the case here. Right? You look at, I mean, the revenue growth of OpenAI. You look at the enterprise spending that's happening. There's GDP that's coming from labor spend. There's GDP that's coming from BPO operations that had been executed in foreign countries and are now basically being re-onshored.
And so I think it would be too simplistic to just say vendor financing means I'm calling the top. That's not the case at all. It's just the expectations around $500 billion a year in capex, and the return on invested capital, and the depreciation schedule.
I think the bigger question is how the bondholders would make out on all this, right? Yep. Where does the risk actually live?
Because I feel like there's this world where OpenAI is kind of rolling a 20-sided die, and as long as they don't come up with a one, they're going to be fine, while everyone else is flipping a coin and it's 50/50 whether they make it out. Okay.
And everyone's playing the same risk game, but the risk is just way higher for some of the folks who are further out in the capital stack: more risk-on, more debt-laden. And a lot of the narrative is that OpenAI is taking on all this debt.
It doesn't actually feel like they're taking on that much debt. It feels like their partners might be. Yeah, that's exactly right. I don't think OpenAI is really taking a lot of balance sheet risk. I think it's the lenders who are taking the balance sheet risk on.
So why is there some reason to be concerned? Google ran an analysis on the GPUs in their data centers. They found failure rates at 50 to 70% utilization were significantly higher after three years, which is half of the amortization or depreciation schedule. Then you have next-generation chip architectures.
SGI just announced yesterday Cornell research that showed a 90% improvement relative to GPUs with a new chip architecture, and then a second generation that's promising a 100x performance improvement from there.
And so fundamentally underpinning all this is how fast tokens are growing, which is an inflationary force in the ecosystem, while the deflationary forces are algorithmic improvements and chip improvements. So what is the net vector that comes out of that? What is the slope?
We don't know the answer, but it seems like both are growing very, very fast, and so it's hard to predict.
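As a toy version of that inflation-versus-deflation calculation (the rates below are invented, not estimates): total token spend only grows if volume growth outpaces the per-token cost decline.

```python
def net_spend_growth(token_growth: float, cost_decline: float) -> float:
    """Net change in total token spend: volume growth (the inflationary
    force) times the cost-per-token decline (the deflationary force)."""
    return (1 + token_growth) * (1 - cost_decline) - 1

# Tokens up 3x (+200%) while cost per token falls 50%: spend still grows 50%.
print(net_spend_growth(2.0, 0.50))  # 0.5
# Same volume growth but cost falls 75%: total spend actually shrinks 25%.
print(net_spend_growth(2.0, 0.75))  # -0.25
```

The sign of that net vector, under real rather than invented rates, is exactly the unknown slope being described.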
Do you think there's much more confidence on the inside? Do you think that, internally, some of the dealmakers are viewing this as, hey, this is a leap of faith, but we're all taking a leap of faith? Or do you think that when they hear us yapping about them on a podcast, they think, these guys don't know the data, they don't know how solid of a bet this is?
I bet if you're on the inside of one of these companies... I mean, Google and Microsoft both said in separate earnings announcements this year that they were hardware constrained, and you just watch the consumer adoption. I think OpenAI will reach a billion MAUs faster than anything.
And so I think if you're on the inside, there's very little reason to worry, because you see the ultimate demand. The first wave of all this AI was just better search, compressing all of human knowledge into an LLM.
And now we're at this tool-calling or agentic phase, where it will actually start to do work for us, and it works pretty well some of the time. Yeah. But if we can improve that meaningfully, then the demand for tokens will go through the roof. Um, I have another question.
I'm trying to understand how people feel about the ramp of agentic commerce. Because in the limit, if I think about a billion MAUs, people who are purchasing things and using ChatGPT regularly, they're just going to wind up buying things there. And taking even a 1% cut of that commerce, through affiliate revenue, or a Stripe-like tax, or some sort of ad auction to see, will you buy it on Walmart or Etsy or Amazon or somewhere else?
Even not just surfacing a direct advertisement for something you're not shopping for. That feels like it could be very lucrative on an ARPU basis.
But I'm wondering how long it will take for that pattern to actually manifest if people aren't in the habit of searching there. Because over the past two years it's been, yeah, maybe you'll chat about something, but then you have the natural flow of, oh, then I go to Google to order it, or then I open Amazon to order it.
So how do you think about that ramp? Well, it's a super exciting time.
You've had the ad market historically dominated over the last 15 years by Google and Meta, right: about $250 billion on search, about $265 billion on social. And so it's been really hard, candidly, to invest in the advertising ecosystem. But now it's wide open. I think agentic commerce will have pretty significant tailwinds here, because you are in the flow, you are researching. I mean, I don't know about you guys, but the number of websites that I visit compared to, say, five years ago has fallen off a cliff, because instead of going through all the forums to figure out what is the best bike carrier for the family, I just ask, and then, great, send me the Amazon link and I'll buy it.
Yep. So maybe there's an ads model here. You know, we had kind of contemplated whether OpenAI or others would run a keyword auction to inject ads into the context window, that is, the additional information in addition to the search query. Sure.
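For what a keyword auction over the context window could look like mechanically, here is a generic second-price auction sketch; the bidder names and bid amounts are hypothetical, and nothing here reflects an actual OpenAI design:

```python
def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Highest bidder wins the ad slot but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # With a single bidder, they simply pay their own bid.
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Hypothetical bids to answer a shopping query in a conversation
bids = {"walmart": 1.20, "etsy": 0.90, "amazon": 1.05}
print(second_price_auction(bids))  # ('walmart', 1.05)
```

Second-price pricing is the standard design in search advertising because it encourages bidders to bid their true value.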
Um, you know, one of the crazy theories I've been wondering about: I was chatting with a friend who suggested, well, what if OpenAI takes a 30% cut, not a 1% cut? What if it looks like the Apple model? If you ask a retail brand, how much of your revenue does Meta take?
It's probably like 30%. Yeah. And if the performance is better, then merchants and retailers will pay it. Yeah. How do you think about direct agentic commerce affiliate? Like, I'm searching for the best bike rack.
It gives me a bunch of options and then at the last second it runs an auction to see what, you know, provider will actually send that to me and they take a cut there versus I'm searching for history of the Roman Empire, but it knows that I'm interested in shopping for a bike.
And so in the middle of that feed while I'm reading a deep research report about Rome, it says, "Hey, do you want to check out what that bike you were looking at earlier?
" Because the Instagram flow, yes, there are people that search for camping gear and they scroll and there might be an ad that matches up, but a lot of times you're just looking at family photos from friends and it says, "Oh, here's the thing that we know you want and you might buy it, right?
" So, those are two different kinds of ads, right? So, you have ads, let's say, within the search and then the finding an artic or targeting an ad to you about a bike rack when you're reading about the Roman Empire is called retargeting. Yeah. And it's t typically done from search. Hugely successful advertising program.
I think both exist. Yeah. And it depends on how considered the purchase is. So everybody has a different level of willingness to spend before they, say, consult their spouse or take some time to think about it.
And so you have these impulse buys, and then I think retargeting works very, very well for considered transactions, like a car. If you're really interested in the next electric car coming from Rivian, well, the ad at that point is actually pretty useful for you.
And one of the things I learned in the advertising business was that seven impressions was kind of the key to building a brand. So there is value in the brand-building component and the retargeting component, as well as the immediate transaction part of an ad unit.
Do you have an idea... I mean, if we're just to play out sort of a median case for a bubble popping, you might map OpenAI to Google, you might map Anthropic to Amazon, companies that made it through the dot-com bubble. Maybe there's some overbuilding that happens, and you get a Lucent, and you get some big companies left in the wreckage. But what is your advice if you're talking to a dot-com entrepreneur who raised $50 million, doesn't quite have product-market fit, and it's 1999?
You now know that the bubble's about to pop. What advice are you giving a founder? Hey, it's raining right now, everything's really great, but it might not be that way forever. How are you setting up your business for durability in a potentially rocky time? Great question.
First, it's to bolster the balance sheet.
And I know this goes a bit against the grain of don't raise too much capital, but I think what we learned in '21 is that the companies with pretty significant balance sheets were the ones able to reinvent themselves and weather the storm, either by acquisition or new product.
And so I think the cost of capital is incredibly low, so you should be shopping for capital if you can. And then the second thing: I think there's this notion that product-market fit is a binary condition, that once you pass the gate, you're done. Mhm. And that's no longer the case.
Product-market fit is a status you have to keep earning. It's like United miles: you need to continue to fly to maintain 1K. And we saw it between '21 and '24: companies that were classic software companies literally overnight lost their product-market fit. Yeah.
And that'll be very much the case again. Do you think the labs have an incentive to get various players to overbuild?
Because they're basically a buyer at any price right now, since they're GPU constrained. But if you can collectively get people to overbuild, then two years from now you might have dramatically lower costs.
And I think you're already seeing this in some cases, where actual GPU prices are through the roof, but the leasing prices are quite a bit lower. But I'm curious what you think, kind of playing out the game theory there.
Yeah, I mean, I think if they want long-term relationships with their capital partners, they will probably want to game it a little bit, but maybe not too much. Yeah, they definitely want to see a little bit of excess. And that's okay.
You look at the CPU spot versus contract markets: there's excess CPU capacity on AWS and Google, and the margin structures there are really good.
I think one major question with the GPU market is that within the large hyperscalers you do see great utilization, and there's a question around the neoclouds of what utilization they are seeing and what the unit economics are there. But overall, I think they probably won't look to burn others, just because they need these relationships for 20 to 30 years.
Yeah. Oracle is down 17% in the last 30 days.
Do you think that will hinder their ability to actually deliver the capacity that OpenAI is expecting? Just in the sense that they were raising debt, and a lot of people are saying they're basically properly leveraged and maybe shouldn't go much further from here. But I'm curious: do they just need to go to the 2040 forecast, you know, 2050? Just keep moving RPO out to the end of the century, why not? Uh, yeah, I think, I mean, it's a good business, right?
It produces a lot of cash. The Information article around the roughly 10% gross margin structure on some of these contracts, I think, is what catalyzed that drop in price. And so, yeah. But it did drop dramatically, and then it's just been chugging downhill since. Yeah. Yeah.
Yeah, I mean, look, these are big long-term contracts with unclear demand and the demand is growing really fast. And so, I think if you're an investor, a long-term investor, you have to be hedging at some point. Yeah.
There's also a lot of hype that comes out when there's a new biggest number. Everyone is like, "Oh, I didn't even know that they were in the game, and now they're at the top of the game with the biggest number." And those people pile in, and then they might get, you know, jitters.
Like, the whole detail of that margin, the nuance there, was that of course you pay for the GPUs before you start making money from them, because, like any manufacturing process, you buy the raw materials before you make the widgets.
Um, but you know, people obviously got a lot of jitters from that. Totally. Yeah, go for it. Oh, I was gonna say, I think there's this great parallel.
I don't know if you guys read the book Barbarians at the Gate, which was about the LBO of RJR Nabisco. It was the largest buyout at the time, fueled by the junk bond wave from Milken, right?
And then that, one, created the LBO asset class in a very real way, but, two, there was a big contraction. And so I wonder: will we have a Barbarians at the Gate book written about some of these major data center contracts? I really hope so.
There are so many OpenAI books coming out, and I know that they're all just going to focus on the doomers versus the nonprofit stuff. Yeah.
And it's going to be intriguing, and it'll be like social network vibes, but I want an Andrew Ross Sorkin book about this, more than the salacious journalist take of, OpenAI is bad for whatever reason. Yeah. The business history. Exactly. In the room. Yeah.
I read The Caesars Palace Coup.
I don't know if you've read that. It's about Apollo, and it's the most nitty-gritty account of the debt restructuring, another LBO of a casino empire, and it's just super fascinating to me. Pretty low TAM, but I hope we get one of those books just about OpenAI, because of the novel deal structures that happened at every single turn, even going back to the Microsoft deal for a billion dollars of credit. We're going to reinvent the wheel? They reinvented the wheel 12 times. It's not just the nonprofit and the for-profit; it's way more complicated.
Every deal is, like, unique and crazy, and then now these deals. It's all a lot of fun. Do you think about the Google ownership in Anthropic, and then the Amazon ownership of Anthropic, and what it means when you have two strategics each owning 10-plus percent?
That's why, when the Google news was hitting the timeline yesterday, that they're working on some type of major deal, I was like, why are you surprised that a double-digit-percentage stakeholder is working on a deal with their portfolio company? That is the obvious thing.
That's famously bad advice, though. If you're a startup, it's usually, don't let a strategic build a 10% position in your company unless you're actually planning to sell to them, because it will preclude you from working with the other potential strategic buyers or investors.
And that just hasn't proven true in this market at all. It's completely different. Did anything about the Karpathy interview update your worldview?
I thought it was just very pragmatic and real. And the decade-of-agents point is, I think, what everybody's experiencing who's trying agents and talking to people. There are some that are great. There are some great coding agents. We have deep research. That's awesome.
But yeah, there's clearly more time needed to figure out a lot of these use cases. But I'm curious what stood out to you. Yeah, I think reinforcement learning was kind of the, you know, song of the summer, and there are two different approaches.
One is human labor creating virtual environments, or gyms, where AI can train to understand how to update your CRM. And then there are standard-operating-procedure-based, text-based approaches, which we call context databases. Those are the two approaches. We've written an article, I think the first one on the topic, about context databases. And so we definitely agree with him there that the current approach of creating these gyms mirrors the fine-tuning of the previous era. And what I mean by that is, if you take a regular model and fine-tune it, customize it to a particular task, it's really brittle.
By the time the model updates or the tasks update, you have 30 to 50% of the prompts breaking, which is what we experience today in tool calling. And so I think he's largely right. There's a lot of research happening within reinforcement learning, which has two parts.
The first is making a plan, and the second is creating what are called reward functions: Mario gets 500 points for eating a mushroom, or how many points does a robot get for filling out your CRM? The first one we can do with AI. The second one is still largely research.
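As a toy illustration of that second, harder part, here is what a hand-written reward function for a hypothetical CRM-update gym might look like; the field names and point values are invented for the example:

```python
def crm_update_reward(expected: dict, actual: dict) -> float:
    """Hand-written reward: +1 for each field the agent set correctly,
    -1 for each expected field it set to the wrong value.
    Writing rules like this by hand is the part that is still research."""
    score = 0.0
    for field, want in expected.items():
        if actual.get(field) == want:
            score += 1.0
        elif field in actual:
            score -= 1.0
    return score

expected = {"stage": "closed-won", "amount": 50_000}
print(crm_update_reward(expected, {"stage": "closed-won", "amount": 50_000}))  # 2.0
print(crm_update_reward(expected, {"stage": "closed-won", "amount": 40_000}))  # 0.0
```

The brittleness discussed above shows up exactly here: change the CRM schema or the task, and this hand-coded scoring rule silently stops measuring the right thing.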
And the amount of work, the sheer academic brainpower, going into that is enormous. So I really hope we make strides faster than what he said.
And I mean, I'll tell you, I was trying to replicate Claude Code myself with open source technology yesterday, because the plane internet wasn't working very well.
And I realized that when you use Claude Code or Cursor, its tool calling is just asking the model one question, taking the answer, putting it back into the model, and then again and again and again.
And there are different loops. Gemini CLI says, do that 10 times, and here are the success criteria. That's tool calling. It's extremely basic, and there's some management of, okay, after the first step this is what we learned, now go do this. I thought it was much more sophisticated, but it's very rudimentary. And just with the way we've optimized memory around these models, we can do tasks and hit different benchmarks for science and math.
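The loop being described can be sketched in a few lines. The model and tool below are stubs standing in for a real LLM API, so this is only a minimal illustration of the pattern, not how Claude Code or Gemini CLI are actually implemented:

```python
def fake_model(history: list[str]) -> str:
    """Stub LLM: requests a tool call twice, then declares it is done."""
    tool_turns = sum(1 for h in history if h.startswith("tool_result:"))
    return "DONE" if tool_turns >= 2 else "CALL list_files"

def run_tool(call: str) -> str:
    """Stub tool execution; a real agent would dispatch on `call`."""
    return "tool_result: file_a.py file_b.py"

def agent_loop(max_steps: int = 10) -> int:
    """Ask the model, run the tool it asks for, feed the result back,
    repeat: the 'extremely basic' loop described above."""
    history: list[str] = ["task: summarize this repo"]
    for step in range(max_steps):
        reply = fake_model(history)
        if reply == "DONE":
            return step  # tool calls made before the model stopped
        history.append(run_tool(reply))
    return max_steps  # hit the step cap, like 'do that 10 times'

print(agent_loop())  # 2
```

The `max_steps` cap plays the role of the "do that 10 times" guard: it bounds the loop when the model never decides it is finished.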
I think we'll start to see that with RL. Although I think the big question is whether we need a new architecture to really solve some of these problems, and maybe that's what Karpathy was alluding to. But we have a long way to go. Well, thank you so much for coming on the show. This is always a great time.
Let's grab more time next time.