SemiAnalysis' Doug O'Laughlin: Cerebras has a narrow path, Groq fits NVIDIA's GV200 rack, and TSMC is the AI buildout kingmaker

May 14, 2026 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Doug O'Laughlin

Speaker 2: Oh, he had

Speaker 1: a famous, controversial stock sale. He sold 200,000 Harken Energy shares in 1990 before bad news came out.

Speaker 2: Okay. Interesting.

Speaker 1: Well, there's no other evidence that we're finding of presidential stock trades.

Speaker 2: Well, we'll dig into it, but we have Doug O'Laughlin from SemiAnalysis in the waiting room. Doug, how are you doing? Welcome to the show.

Speaker 4: Good. Good, man. You know, pretty busy day. Another day, dude. Honestly, every day is a busy day in this race.

Speaker 2: Every day is a busy day. Take us through it. How do you think the market reacted to the Cerebras IPO, to the SemiAnalysis deep dive on the company? What is the overarching story here?

Speaker 4: So I think the market was obviously positive. I don't think we're quite as positive as the market, but it's a bull market, baby. Mhmm. I think the takeaway is that Cerebras got to IPO, which at one point in time, we in the SemiAnalysis world didn't think would happen.

Speaker 2: Yeah.

Speaker 4: We've historically been very bearish on SRAM.

Speaker 2: Okay.

Speaker 4: But I think there's a path forward for them to be a disaggregated prefill chip, or maybe even an AFD chip. Yeah. Meaning attention-feed-forward disaggregation.

Speaker 2: Okay.

Speaker 4: So yeah.

Speaker 2: Yeah. Unpack sort of the competitive dynamic. Like, the fear around Cerebras, as far as I could tell years ago, was: will this ever be useful? Will they ever actually be able to make it? Will it have defects? Then it became certain applications, demand side, customer concentration. But where do you think they are now? How has that journey evolved?

Speaker 4: So first and foremost, Cerebras is about SRAM. SRAM is like the fastest possible memory, and it's kind of done on a logic process. Yeah. But the problem is SRAM scaling is dead, meaning that you can't make SRAM cells smaller and smaller. Mhmm. So pretty much they committed to this dead-end process by having the biggest scale-up world size, a whole wafer, but then the models got much bigger than just a single wafer. And so they have really, really fast inference, but only up to a certain size. And I think the real capability problem is: can they inference models larger than a trillion parameters? And I think the answer, as we see it right now, is that it's pretty unlikely in the near term.
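A rough back-of-envelope check of the capacity wall Doug is describing. The figures here are illustrative assumptions for the sketch, not SemiAnalysis estimates or vendor specs: tens of gigabytes of on-wafer SRAM versus roughly a terabyte of weights for a trillion-parameter model.

```python
# Back-of-envelope check of the SRAM capacity wall described above.
# All numbers are rough assumptions for illustration, not vendor specs.

def weight_bytes(params: float, bits_per_weight: int) -> float:
    """Bytes needed just to hold the model weights."""
    return params * bits_per_weight / 8

ON_WAFER_SRAM_GB = 44   # assumed on-chip SRAM for one wafer-scale part
model_params = 1e12     # the ~1T-parameter threshold from the discussion

needed_gb = weight_bytes(model_params, bits_per_weight=8) / 1e9
wafers = needed_gb / ON_WAFER_SRAM_GB

print(f"weights: {needed_gb:.0f} GB -> ~{wafers:.0f} wafers of SRAM")
# Weights alone need ~1000 GB, i.e. over 20 wafers' worth of SRAM,
# before counting KV cache or activations -- hence the size ceiling.
```

Even under generous assumptions, the weights alone overflow a single wafer by more than an order of magnitude, which is why the model-size ceiling, not raw speed, is the binding constraint.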

Speaker 2: Yes. So I understand all that. I'm just wondering: should I view it more like a CPU? Because when the AI boom, the ChatGPT moment, happened, the obvious buy was NVIDIA, because we were going to need a lot of GPUs. No one was really expecting a chip shortage in CPUs, but then agents wound up using CPUs for a bunch of stuff. You have to keep the GPUs filled. And so CPUs are now in demand. And I'm wondering if there's this world where, yes, you're going to move past the trillion-parameter models, but we're going to keep using them forever, just like we use relational databases forever, even in an agentic AI world.

Speaker 1: Or you have a scenario where you have a big model that is giving sort of

Speaker 2: Yeah. Orders. Workloads. Delegation.

Speaker 1: Or something. Delegating to a smaller model.

Speaker 4: Yeah. I think in a perfect world where there are no silicon constraints, that might be true. But obviously, there are silicon constraints. And I think Cerebras is really well optimized for a certain problem, and we think they do a great job at answering it, which is fast inference at a certain size of model. Maybe that market's gonna be large enough. And I mean, I don't think I was ever bullish Cerebras the entire time. But now that we're here, not ironically, 1% of a very large market works. Yeah. I think they got, like, 1% of a very large market.

Speaker 2: Yeah.

Speaker 4: When it first started, it was like, oh, yeah, what are you going to do? 1% of a very large market? That's going to be a few hundred million dollars.

Speaker 1: Well, that's like the classic seed

Speaker 2: Sounds like it.

Speaker 1: Seed pitch too. Yeah. Ten years ago.

Speaker 2: Yeah. For a long time there was a lot of fear around ASIC companies around architecture changes: we're going to move past the transformer and they're all going to be locked in the past. Is there any optimism that there's an architecture change that actually is to the benefit of Cerebras and makes them more relevant in the future? Do you think

Speaker 4: No, that's possible. I mean, that's above my pay grade. That's too gigabrain for me, right? But where I'm at in my understanding, there is a narrow path for them, I think. And I think they're going to be able to inference maybe a trillion parameters at very small context window sizes

Speaker 2: Yeah.

Speaker 4: Or smaller models at very, very fast speeds.

Speaker 2: Yeah.

Speaker 4: But I don't know, man. Maybe, I mean, like, you know, the true gigabrain take is that Mythos is so good, or whatever, that it makes compute efficiency super easy, and, you know, your model is inefficient and the AI understands.

Speaker 2: Yeah. Yeah. Yeah. Distill yourself so you can run on a Cerebras chip just as effectively. Okay. Now

Speaker 4: we're talking. That's that's the gigabrain thesis. But I think I just think that there is there's demands. Right? Like, clearly, we're in a shortage. And Ironically, in a shortage, it's not the best company who wins. I mean, you can look at NVIDIA's stock chart and that tells you. It's the second, third, fourth best companies where the demand overflows. Right? And so we're seeing all that today.

Speaker 2: Yeah.

Speaker 4: And I think the reality is the market's big enough for a lot of demand, and Cerebras is in that space. So they've done a really good job. And I mean, it's a cool engineering problem. But we think it's kind of a solution looking for a problem, because the world of LLMs blew up at a much faster scale than anyone could have ever thought of. Yeah. The size, I think, is really the difference.

Speaker 2: Yeah. Yeah. Give me a little primer on Groq: how Groq fits into the SRAM machine market, what the view is. Because it felt like NVIDIA's move there, the license-acquire, as you put it, was defensive against Cerebras. Is that the correct framing? Like, how does Groq fit in on this?

Speaker 4: Okay. So let's talk about exactly where Groq fits into the architecture. So in a transformer architecture, you have, like, the multi-head attention, and then there's a feed-forward network.

Speaker 11: Mhmm.

Speaker 4: That's a portion of, you know, essentially the entire transformer block. And what's become really hot in the last few years, or not even two years, like probably a few months, man, is you've been disaggregating all the different parts of inferencing into subsequent specialization. So we're talking about GPUs and ASICs being a specialization over CPUs, but now we're actually starting to break essentially the constraints of inferencing into different compute-bound and memory-bound pockets. And so, for example, we're finding prefill, prefill being essentially loading all the weights, ends up being compute constrained. So you don't really need a lot of memory bandwidth. So why don't you just use a very FLOPS-heavy portion, and you disaggregate the memory onto the decode portion, which is, like, extremely memory-bandwidth limited. And so this is where Groq fits in. The strategic thought process here is: in the GV200 rack, what you can do is pass the activations over to the SRAM in the Groq LPU rack, and that is an extreme speedup. And so that's a perfect example of another break-apart of the transformer architecture. Pretty technical, but that's the thought process here: the memory is so fast, the speed of the IO doesn't really matter, and you don't need a huge scale-up world size because you're just streaming the activations. Yep. That problem wouldn't work with the Cerebras chip, because it's an island, right? You just think of it as an island of compute. It's really, really good at everything in the middle, but moving anything off the island is really hard. Versus moving something off the island onto a Groq chip, because there's a plug at the end of it, is a lot easier. And that's kind of the calculus, I guess.
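A rough way to see why prefill and decode land on different hardware, per the disaggregation argument above. This is a toy roofline-style sketch with made-up numbers (the machine balance point and token counts are assumptions, not benchmarks): prefill amortizes each weight read over many tokens, so its arithmetic intensity is high; decode reads every weight to emit a single token, so it sits far below the balance point and is bandwidth-bound.

```python
# Toy roofline comparison of prefill vs decode for a dense matmul layer.
# Numbers are illustrative assumptions, not measured benchmarks.

def arithmetic_intensity(tokens_in_flight: float) -> float:
    """FLOPs per byte of weights moved. Each weight (assume 1 byte)
    contributes 2 FLOPs (multiply + add) per token, and is read once
    regardless of how many tokens are batched against it."""
    return 2 * tokens_in_flight

# Assumed accelerator balance point: peak FLOPs/s divided by bytes/s.
MACHINE_BALANCE = 300  # FLOPs/byte; compute-bound above, bandwidth-bound below

prefill = arithmetic_intensity(tokens_in_flight=4096)  # whole prompt at once
decode = arithmetic_intensity(tokens_in_flight=1)      # one token at a time

print(f"prefill: {prefill} FLOPs/B, compute-bound: {prefill > MACHINE_BALANCE}")
print(f"decode:  {decode} FLOPs/B, compute-bound: {decode > MACHINE_BALANCE}")
# Decode lands far below the balance point, which is why handing it to
# very high-bandwidth memory (like SRAM LPUs) gives such a large speedup.
```

The asymmetry is the whole argument: the same chip can't be optimal for both regimes, so you split the workload across specialized parts.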

Speaker 2: Yeah. So Cerebras: lower memory bandwidth, lower interconnect speed?

Speaker 4: Off the chip. On the chip it's fast. Yeah. Okay. Yeah.

Speaker 2: So what does that mean for the Groq-NVIDIA ecosystem? Because is this something where the default configuration is going to be a Blackwell and a Groq chip in, you know, 50% of racks, 80% of racks? Or is this still some sort of niche application where Groq is going to be deployed, you know, sort of sparingly, sprinkled into specific use cases? Do you have an idea?

Speaker 4: Yeah. I don't have an idea with high precision. I think you'll find that with a lot of these things, there's a lot of different ways to split up and serve your model. So expert parallelism, pipeline parallelism, tensor parallelism. Right? And so the correct optimization per hardware rack is going to depend on the shape and architecture of the model, and we don't really know with high precision what is what. And there have been different roadmaps along the way in terms of what they wanted to do for speeding up inference. A perfect example of this is the CPX rack, which was mostly built for expert parallelism. It kind of remains to be seen if the Groq GV200 speedup is gonna be the way forward. But it's definitely a technology tree that I think Jensen is excited about. So, I mean, we'll see.

Speaker 2: What about Lisa Su at AMD? Is she excited about this technology tree? Can you give me an update on how AMD fits into all of this?

Speaker 4: So AMD is mostly just trying to get the last thing to work, which is the rack scale up.

Speaker 2: Yep.

Speaker 4: And I think they're going to do a good job with the MI450. I think what's going to happen is that, like, you know, it's a compute shortage, right? So you talk about overflow demand. I think Lisa's going to figure it out. But on the inference-serving side, I think there's definitely some demand or desire to probably match the NVIDIA roadmap. And I wouldn't be surprised to see some kind of fast SRAM offload FFN chip in the next twelve months. But the thing is, the number of candidates there is actually pretty low. I think Intel's going for SambaNova, which is a little clever. There's, like, Ethereum two. There's a few other players out there that pursued SRAM scaling. But I think that in this specific case, Lisa's mostly just focused on the last thing. And I think AMD is definitely good enough right now.

Speaker 2: Okay. On Intel, what is the latest there? It feels like the roundtable has been assembled, and sort of everyone has held hands and decided to maybe jump across the transom at the same time, take the leap of faith. But it also feels like, you know, lithography machines are majorly backlogged, like there's a whole supply chain that they have to answer to that's backlogged. And so really high expectations. But what is the next milestone for them after they actually get these deals with Apple and Elon Musk?

Speaker 4: Amazon and Elon Musk. Yeah.

Speaker 2: The Gigafab. Sort of, like, once they get those signed, what does the next couple of years look like?

Speaker 4: I think it's about execution. Mhmm. It's kinda crazy to me that the stock price is ahead of the technical turnaround. And I think Lip-Bu Tan has clearly righted the ship and gotten the right people onto the party, if that makes sense. And I really do think the government-Intel deal was a stroke of genius, because Pat Gelsinger spent, you know, three years trying to build bottom-up demand to essentially come to the fab. And Trump's like, yeah, none of this. I'm gonna sign the deal from the top. Mhmm. And what's gonna happen is you're gonna come play, because we're the United States government, or else.

Speaker 2: Yeah.

Speaker 4: And so I think the people are there. I think the customers are there. I think the process is good enough. I think 14A will also be good enough, given how much of a shortage N3 at TSMC is in, and it's all execution risk from here. But the historical Intel has quite a bit of execution problems, so we'll see.

Speaker 2: Okay. Before we move on to TSMC, which I want to go to next: are there any other interesting ASIC projects on the horizon? We've talked to a few of these companies, but I'm interested in the shape of the differentiation. Like, you explained a little bit of the divergence in strategies between Groq and Cerebras, but there's Etched and a bunch of other companies that are working on new chip designs. And I'm wondering if any of them stick out to you as particularly differentiated.

Speaker 4: You know, I'm not going to go too far into the details, because I feel like some of them are even, like, still figuring out their

Speaker 12: Their road map.

Speaker 2: Yeah.

Speaker 4: I think MatX is kind of interesting. Yeah. The way that they're kind of trying to pursue the memory problem. And with Etched, I'm excited about the kind of YOLO bet, if that makes sense: just make a big systolic array.

Speaker 2: Yeah.

Speaker 4: But I think there might be, like, niche cases. I think the problem is, at the end of the day, NVIDIA's big bus is still really good for the majority of cases, and you're going to have to start to make really opinionated bets on the ASICs to find what niche market ends up being a diverter of demand into their ASIC. And so for the ASIC specialization from here, I feel like you have to make some pretty big-brain bets in order to make your bets pay off. And I think most of the bets that you would have guessed, when you originally did them, wouldn't have paid off. And the ones I didn't expect did. Like, it's kinda crazy.

Speaker 2: Yeah. It is a very weird market dynamic where, a couple years ago, we saw ASIC and new chip companies, new silicon companies, raising hundreds of millions of dollars, or $500 million. And it was like, well, for that, you're going to need this massive market. Are you really going to flip NVIDIA or something? And then the market grew so much that the 1% of a huge market sort of potentially maths out for some of these companies now. It's a fascinating development. Jordan, do you have something?

Speaker 1: Yeah. China trip. Yes. Oh, yeah. What are you tracking?

Speaker 4: On the H100? Oh, so honestly, did you guys see the parade? You know Trump loves a parade.

Speaker 2: Oh, yeah. They're winning the parade. Good parade.

Speaker 4: I was like, dude. I was thinking, Ben, I'm not much of a parade guy, but I was like, dude, if they showed up and that parade was for me, I'd be like, these guys could be friends.

Speaker 2: Yeah.

Speaker 4: My impression is that the executive branch really wants a deal, and I think you saw the H200 list, the verified H200 list. I expect probably more lightening up from the executive branch. Something that's really interesting is, if you look at the legislative branch, there's actually more export control bills going through the House than, like, ever in history. So there's kind of this tension, but I do think, you know, Trump's a businessman. He loves the deal. I expect a deal. So.

Speaker 2: Yeah. Somewhat related: TSMC. Ben Thompson was writing that potentially they weren't ramping CapEx fast enough. What are you tracking on TSMC being a potential bottleneck for the AI build-out? Just as more and more, Cerebras is now trying to get allocation; it feels like a particularly sharp-elbowed place to do business.

Speaker 4: Yeah. So I think at the end of the day, TSMC is kind of a kingmaker in terms of supply. And there's no reason for them to really let the market get out over its skis. And I think they're happy with the pace at which they're expanding, because, like, hey, they're growing their CapEx, like, 40%, but in absolute dollars, these are big numbers.

Speaker 2: Yeah.

Speaker 4: We're gonna run out of TSMC engineers on the island of Taiwan pretty soon here. So I think this is all kind of good on the margin for overflow demand, which is actually Intel. Intel's, you know, definitely reflecting some of that. But I think the shortage specifically at TSMC is driven by clean room. It's a long-lead-time item. It takes three to five years, or let's just say three years, to bring a clean room up. And so in order for them to have, like, figured it out and perfectly matched demand, two years ago they would have had to be like: we have a 10,000-square-foot house, and we need to buy a 50,000-square-foot house, with conviction. Yeah. Right? It wasn't that clear two years ago. And so I'm gonna expect supply to kind of lag over and over and over. But demand signals will continue to essentially command premiums, move up wafer pricing, move up orders, and that's what's gonna make TSMC invest more next year and the year after. But they're gonna do it incrementally. Not in a revolutionary way, but, like, an evolutionary way. They are very, like, methodical and do steps one at a time.

Speaker 2: Okay. Cleanroom fungibility. When you say it takes five years to build a clean room, I immediately go to SpaceX. I imagine that Elon can build big things quickly. Is there some world where that partnership accelerates Intel, regardless of your timeline for the mass driver fab on the moon, all the crazy long-term stuff? Just having Elon around the table to say, oh, we need to build something big and it needs to be, you know, capable of operating as a fab. Like, is there something where he brings more to the table than just dollars, potentially?

Speaker 4: So I definitely think Elon is the man to do it. Mhmm. I forget who said this, but, like, Elon makes the impossible late. Yeah. I don't expect it to be on time. Yeah. You know, talking with the cigar in the terafab, I'm really kind of doubtful. It's, you know, I guess from first principles, it's easier to just clean the entire room than to make, like, really hyper-concentrated pockets. And that's what I would guess the bet is. But I still think by the time Elon figures it out, the supply response will have reacted already. Mhmm. We're still two, three years out, and there is some cleanroom fungibility. And you've already seen this, actually. Micron bought an old power fab. I think this is the TSMC deal. People are buying display fabs. Essentially, every bit of clean room that is not accounted for in the world is being snatched up and retrofitted to kind of meet the supply demands.

Speaker 2: Interesting. Yeah. I mean, that's happening all over. Didn't Ford just announce some sort of AI play today? The stock's up on something. It's all over the place. I am interested in, in terms of, like

Speaker 1: 6%.

Speaker 2: Getting powered shells

Speaker 11: Oh, Ford is worth

Speaker 1: more than Figure now, because around a year ago, I remember, Figure was worth more than the Ford Motor Company.

Speaker 2: It is a rough time. But now they're both AI companies, I guess. But what are you tracking on the American data center build-out, domestically, or terrestrially, before we move on to space capabilities? Yeah.

Speaker 1: Basically how Yeah? Oh, go for it.

Speaker 2: No, no, no. Go for it. I'm just curious about, I mean, we're starting to see glimmers of pushback at the municipal level, different data center bans. And I'm wondering about what are the big levers that need to get pulled to actually continue to bring capacity online in America?

Speaker 4: Yeah. I think that's a good question. And you're already seeing the first level of this, which is the delays. Yeah. My favorite clickbait is "50% of all data centers in America are delayed or canceled," implying 50% is canceled when it's really just everything is delayed. That's, like, my favorite clickbait. I've got to steal that in the future. But I think it's going to be local, municipal, and people have to really believe in and demand and desire the jobs. And I think one of the ways that we're seeing this is, like, you know, capitalism works, and effectively the dollars per megawatt have been going up. It's like a one-way train. In the same way that, like, you know, the power per rack has been going up, the cost of making these data centers has gone up. And one of the ways that happens is it leaks into labor. Right? So essentially, you're super against it, but all of a sudden it offers 3,000 new jobs to your hometown, and you're like, well, maybe I'll take it. And I think that with enough economics, oftentimes, you know, money finds a way. That's kind of how I would guess it goes. But it is, like, a county-by-county fight. Right? Yeah. And some places are just gonna say, hell no.

Speaker 2: Yeah. On that note, we were debating this earlier today. There have been a couple of examples, in, like, viral photos and articles, of: I bought a beautiful house in the countryside, and then they built a data center right next to it. And, you know, no matter how pro-AI you are, it sounds annoying to have a huge building that's an eyesore and maybe noisy, maybe smoky, next to you. Have you been tracking how feasible it is just to throw the data center truly in the middle of nowhere? It feels like America has a lot of land, but what goes into selecting data center sites these days?

Speaker 4: So I think pretty much two fiber pairs is the big desire. Essentially, you're more than willing to go to where the power is, because you have to go to what the biggest actual bottleneck is, and power is the biggest bottleneck. In the past, you're talking about, like, hey, having these inference, or rather, let's say, points of presence near local cities. Right? But power was never a constraint in that world. The biggest constraint was just, you know, getting this video from TikTok to your phone as soon as possible. If the biggest constraint and the largest part of the cost is gonna be power, why not move the data center to the power and then, like, you know, essentially hook it up with fiber? Yeah. And so I think that we're gonna put them in the middle of nowhere. That's just how it's gonna work. To a certain extent, there's gonna be more densification of some of the inference near the population, but I still think the ROI makes the most sense to kick it out in the middle of nowhere.

Speaker 1: Yeah. Has the political backlash, the pushback, updated your thinking at all around the viability of space data centers? I remember, you know, as this idea has gained popularity, you guys have consistently said, yeah, technically you can do that, but maybe it won't be

Speaker 2: takes a long time for the

Speaker 1: There are space data center players now that are kind of loving the pushback against terrestrial data centers, because they're like, the more pushback there is, the more it could make sense for us to put these up in space. But what's your view?

Speaker 4: I still think economics is gonna win out. You know, something a pound on Earth is probably ten times more expensive in space. And it's really hard to beat that out with a completely new, specialized supply chain for what's going to be a smaller market in the near term. It's a real adversary against adoption in, let's say, the short run. In the very long run, because I'm sure you saw the Anthropic Colossus thing, where it's also interested in space, right? Like, the biggest maxi vision of this is AGI. We have, you know, a thousand terawatts of GPUs on Earth, and we're like, we gotta put a terawatt in space. Right? So, like, in that world, I think space data centers work, where a small percentage actually gets

Speaker 2: so big. It's It's 1% of the market again. It's just like Just 1%. One per and it's a trillion dollar. It's a trillion

Speaker 1: dollar vindicated.

Speaker 4: Yeah. Yeah. VCs are vindicated. What's TAM

Speaker 1: TAM pitch deck slides vindicated.

Speaker 2: Yep. Yep. Yep.

Speaker 4: Yeah. It's literally as big as the galaxy, bro. There's just no end to it, actually. Think about how big the TAM

Speaker 2: is. Yeah.

Speaker 4: So I think what is more likely is, if it continues to be painful to do it from a zoning perspective in America, it will essentially slip into other geographies, probably in the Western Hemisphere. There's a lot of power and space in Brazil, and I think that that's probably good enough. There's definitely ways to make this work. I definitely think the only way you do it is by paying more and finding someone who's like, you know what? I'll take the bet. And so that's the important part. But, you know. Yeah. Money finds a way.

Speaker 2: Is that sort of the bull case for sovereign AI initiatives? I was always super skeptical, because, like, Europe didn't get France's Google. They just use Google. And for a lot of aggregator-type consumer internet companies, it's like: Spotify is from Sweden, but it could be from America and it wouldn't matter. YouTube is from America and they use it over there. You didn't need a national champion in every consumer category; there were certainly returns to scale, and a lot of the American companies just won. So I never really bought the whole idea that, oh, the French need a locally trained LLM and the Germans also need a locally fine-tuned something or other. But if every country has some sort of excess supply of energy or space or regulatory capacity for data centers, bringing that online and just operating like a neocloud could be economically valuable for that country, regardless of whether or not they're vertically integrated to the point of the consumer or the business that's running an AI agent.

Speaker 4: I think that's probably the case, where at the end of the day, economics is gonna kind of push it through. And there is FOMO, and Europe did do a lot of investment in the Internet really late, if we're gonna use 1999 as an example. I think the thing I keep thinking about is that this AI thing is going to be a big deal. I continuously am shocked and surprised by the magnitude and scale. That's a derivative violation.

Speaker 2: I don't think it is right now. I feel like we are in a particular moment where

Speaker 1: No. There's just the

Speaker 2: people calling the top of the bubble, like, they're awfully quiet right

Speaker 1: now. And that makes me even more scared.

Speaker 4: Yeah. That is, okay. So to be clear, you know, at the true top, everyone's bullish. Right? Everyone's like, dude, it's actually gonna be bigger

Speaker 8: next year.

Speaker 4: Yep. It's actually just gonna be a bigger bubble, so shut up. Yeah.

Speaker 1: I was not concerned about a bubble when everyone was saying

Speaker 2: It's a bubble. Yeah. Exactly.

Speaker 4: I mean, I'm a little concerned it's a bubble, but at this point in time, I think if you look at the big... I've been reading

Speaker 1: here's my view. Please. It's not a bubble until you guys are spending 120% of revenue on tokens.

Speaker 4: Yeah. Our gross margin goes negative. Yeah.

Speaker 2: He's just like, we're raising a mega fund. We're not gonna be investing it. We're gonna be burning

Speaker 1: it. It's actually not a bubble until SemiAnalysis goes public and trades up 600%.

Speaker 2: There we go. I like that.

Speaker 4: That's the real talk. No. No. I think there's a few things that have to happen. I think OpenAI or Anthropic, someone has to go public, and it's gonna be this year. Like, we have to hit that keystone before it's all over. But I also keep thinking about this as, like, dude, this is a big technological revolution. Yeah. I think it's bigger than the Internet, and I firmly believe this. I don't think I believed it'd be bigger than the Internet maybe even two years ago, but I'm pretty convinced this can be bigger than the Internet. And if you look at the past, these big technological changes are often bigger than, I don't know, everything else. It reshapes the entire world. For example, on the sovereign AI thing, maybe you're like, yeah, you don't need to fine-tune an LLM. But what happens when AI becomes such an important, fundamental, almost society-level institution that a government can't control it? That becomes really, like, uncomfortable and weird. Where it's like, hey, Anthropic can just, you know, put in 5% of the compute of Mythos and run a really effective government, you know, whenever you want it. And you're like, whoa, what does that mean for us? Yeah. And so this wave is so big that I think people are going to act out of fear and concern that they're going to be left behind, and that the institutions that AI will bring are going to be bigger than the original thing that we're doing. I think that's the problem. Right? Like, the industrial revolution changed everything.

Speaker 1: Yeah. The other thing that we were joking about in Q4 of last year: John was like, great, the bubble pops. Oh, yeah. Like, the bubble inflated and then it pops, but then we got agents, and then you have this sort of reacceleration of every metric across the board. And so the other thing is, we're trying to comp the AI boom to the Internet. But the problem with the Internet boom is that we didn't have the Internet. The Internet was coming online and people were getting access to it. And so the entire build-out and all of the capabilities and all the companies took a lot longer to sort of grow, right? And now you have that core infrastructure, and so when you're layering on more infrastructure, that accelerates all the underlying trends.

Speaker 2: Yeah. Well, I mean, the lab revenue multiples are, like, an order of magnitude or two off of dot-com peak multiples. In the public markets, Google, Amazon, Apple, all the hyperscalers are at pretty reasonable price-to-earnings multiples still, even with all the CapEx and stuff. So

Speaker 4: The pushback would be it's on free cash flow, that

Speaker 2: Yeah.

Speaker 4: You can make earnings look good instead of free cash flow. But, like, I think the revenue continues to be real. The demand continues to be real. And until you just, like, see demand evaporate, like

Speaker 2: Yeah.

Speaker 4: It's hard for me to sit here and be like: GB prices are up a ton, Claude Code is really valuable to me, I still think I'm an early adopter, and, you know, this is all gonna end tomorrow. I envision myself using it more every single day for the rest of my life, which is kind of crazy, and I think I'm an early adopter. And so I just think it's hard for me to envision this not being a ginormous deal. And I wrote this whole thing about, like, angles, pause or whatever. Like, it's gonna change everything. Like, the amount of net output that's gonna increase is going to just blow up our minds. It might be bad for GDP, ironically, because GDP will be unmeasured. Like, GDP might be broken as a concept. GDP got invented in the nineteen thirties to measure how much output you could make, so as not to screw over the domestic economy during World War II. Like, it was a way to essentially organize the American economy, and it's a statistic. It's an estimate. Like, I think a lot of institutions, and the ways we're doing things and the ways we measure, are gonna be attacked by this, because it's such a big change. We have to rewrite the playbook all over again.

Speaker 1: And people and it's and it's funny. I think wasn't Ben Thompson was talking about this in a recent interview of like, people are comping this like, okay. Silicon Valley, like, you know, brought crypto

Speaker 2: Oh, yeah.

Speaker 1: Online, and then it wasn't maybe as big as some people had had pitched it to be even though it's been

Speaker 2: Yeah. Self driving cars Super powerful. VR.

Speaker 1: And then and then even the way you're talking, you're like, you know, we're we're still early, classic crypto But the promise is

Speaker 2: you are early, you have nothing You

Speaker 1: know, in crypto it's like, well like a community could have a DAO and that DAO Yeah. Could be worth a billion that community could be worth a billion dollars but there's just no way to measure that. But now we have tokens and you're saying GDP. But anyways, I'm trying to like unlearn I think some lessons from that cycle because Yeah. There are a number of things that are quite different. It's also What about what about

Speaker 2: the reflexivity that that people do have a little bit of an immune system to just running away with everything because you you you could you could believe this and then bid, you know, Nvidia to 10,000 times earnings or something. And, like, at at certain point, you have to start grappling with the reality.

Speaker 1: What about robotics? Has Figure had a major breakthrough?

Speaker 4: I mean, one, I have not been following Figure as closely as I should be. I just think robotics feels a little further out than the hype would let you believe. I feel like robotics is much more akin to the self-driving car paradigm, where it's like, oh yeah, it's definitely gonna come and automate everyone's jobs, and then it takes a lot longer, and it's a lot unsexier. The scary, or positive, thing about AI is that since it's information work, and it's already been distributed, and it has the perfect network to run on, which is the Internet, it can disperse very quickly. And that's what we're seeing right now. So, yeah, I'm just not anywhere near as bullish on robotics as I am

Speaker 2: Yeah. Well, I'm bullish on the next SemiAnalysis product. I don't know, what are ClusterMAX and InferenceMAX? What are those called? Dashboards or analyses or rankings?

Speaker 4: Dashboards. Dashboards now. We everything's a dashboard, brother.

Speaker 2: It's a Well, you need be a

Speaker 1: new dashboard.

Speaker 2: GTP, gross token production. This is what we're measuring now. This will be the output of the United States. Gross token production, GTP.

Speaker 4: We need to I mean, I think more on this soon, actually. This is, a place we're doing some research on.

Speaker 2: Yeah.

Speaker 4: But I think, you know, the real bubble metric is, how many tokens? What's the token replacement cost? That would be some really good bubble math, where

Speaker 2: it's like,

Speaker 4: yeah, a software company has a really low token replacement cost per market cap, but a hardware company has an extremely high token replacement cost. And then it's like, oh no, it's just enterprise value divided by token replacement cost.
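As a throwaway illustration of the joke metric Doug sketches here, enterprise value divided by token replacement cost: all numbers below are invented for illustration, and the function name is ours, not anything from SemiAnalysis.

```python
# Joke "bubble math" from the conversation: enterprise value divided by
# token replacement cost. All figures are made up for illustration.

def ev_per_token_replacement(enterprise_value: float,
                             token_replacement_cost: float) -> float:
    """Higher ratio = more enterprise value per dollar of token-producing hardware."""
    return enterprise_value / token_replacement_cost

# A software company: big EV, almost no hardware behind its token output.
software = ev_per_token_replacement(enterprise_value=50e9,
                                    token_replacement_cost=0.1e9)

# A hardware-heavy company: same EV, enormous token replacement cost.
hardware = ev_per_token_replacement(enterprise_value=50e9,
                                    token_replacement_cost=10e9)

print(software)  # 500.0
print(hardware)  # 5.0
```

On this (joke) metric, the software company looks far frothier per dollar of token-producing capacity, which is the point Doug is making.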

Speaker 2: Well, the real bubble one will be to go full Mary Meeker, like, the eyeballs metric, eyeballs multiples. Yes. So you will value companies purely on token consumption. You'll say, oh, well, they're consuming 10 trillion tokens, so they must be worth a billion dollars, and then you'll get really weird gyrations.

Speaker 4: That'd be great for SemiAnalysis. That'd be really good for SemiAnalysis. We are consuming a lot of tokens.

Speaker 6: Well,

Speaker 2: you're also outputting a lot of stuff. I really

Speaker 1: enjoyed the Would you guys ever make a sort of political style attack ad against another research firm for having AI psychosis?

Speaker 2: Is that a reference to

Speaker 1: the Sorry. It's a reference to General Catalyst attacking

Speaker 2: Andrew

Speaker 1: Feldman. Andrew Feldman.

Speaker 4: Andrew Feldman? Yeah. Know, life's

Speaker 2: I actually think SemiAnalysis is peerless. I don't think there's, like, a neck and neck with someone else. Like, it's just you guys.

Speaker 4: I'm not. Yeah, I was gonna say, I don't really know who our competitors are. I don't really think about it, Mark and Jason, or, you know, another research firm like that. Maybe one day, maybe we will go through AI psychosis.

Speaker 1: You guys need an arch nemesis. You need an op. Moody's.

Speaker 4: I guess it would be Gartner. What would Gartner have to say? They're like, this is not a good offer. You

Speaker 2: need a SemiAnalysis hype cycle, and it's up only. No trough of disillusionment. Straight line. Yeah. Straight line.

Speaker 4: No axis. And it's it's actually going backwards.

Speaker 2: It's a straight line on a log graph. That's what it is. Perfect. The SemiAnalysis hype cycle. I love it. Gartner doesn't stand a chance. But thank you so much for coming on the show. This was fantastic. Always

Speaker 1: Full analysis.

Speaker 2: Full analysis. Yeah. No more semi analysis.

Speaker 4: Those guys would kick our

Speaker 2: ass, man.

Speaker 4: If they had full analysis, they'll kick our ass.

Speaker 12: It'd be so over, man.

Speaker 2: It'd be over.

Speaker 4: So, anyways, take care, guys.

Speaker 1: Have a great day.

Speaker 2: We'll talk to you soon. Cheers. Bye. Next, we have Andrew Feldman from Cerebras joining in twenty minutes. We'll go back to the timeline, because the OpenAI Elon Musk trial is in its final day. The trial is ending. People expected four weeks of trial. We only got three. They're cutting it short. What are the prediction markets saying about who's going to win? I want to know that. And I want to go to Mike Isaac, the rat king, because he has a breakdown of what's going on. He says, good morning. Closing arguments of Musk versus OpenAI, with special guest Microsoft, are happening today, Thursday, May 14. Again, my guy, of course, kicks it off with what his lunch is. He's got an Epic bar. He's got the bison snacks. He's got a La Colombe latte. He's got a couple other good things. He looks like he's prepared. He's got a bunch of snacks. I feel like he's in a better position today. Learned his lesson

Speaker 1: three weeks of complete self improvement.

Speaker 2: I think that's what's going on here.

Speaker 1: So, the market "Elon wins his case against OpenAI" peaked at a 58% chance on April 28. Okay. Where is it now? It's now sitting at a 30% chance.

Speaker 2: 30% chance. Okay. So right now, the judge is instructing the jury on the criteria by which they should be judging the outcome of the case. Important, because if the jury listens and carries this out, it is a very, very specific lens through which they view all the evidence. Ostensibly, it's where theater ends. Listening to this being read out in court for the last twenty, thirty minutes is very helpful, because it's clarifying how high the bar is for the plaintiff's side in proving some of these claims. Sort of feel bad for the AV guy during this trial. There's been feedback. There have been mic drops, but not in the good way: the mics have been dropping out. Funky video feeds. They need to revamp this place, says Mike Isaac. LMAO, the first joke of the tweet storm. He says, Musk counsel is going after OpenAI execs, Altman and Brockman, and has the mugshot-style photo of Altman on the screen again. The battle of Photoshops of executives in this trial has been entertaining to watch. You want to depict your opponent in the worst possible light. Musk's counsel, going back and forth, is hammering the point they made over and over, essentially painting a picture: Sam Altman? Liar. Chipping away at witness credibility has been a core strategy for the plaintiff's side, and we're back to everyone hates Google again. Molo is using Larry Page, who they claim doesn't care about humanity, as a foil to the noble Musk, whose only care with respect to AI is the future of humanity. Musk's counsel is painting the jurors-don't-trust-Sam picture in a bit more detail for the jury. Also, Musk's side has a picture of Elon and Altman on the screen now. Sam's looks like he's about to be processed by a US Marshal. Musk's looks like he's getting ready for the Met Gala, LOL.
Lots of Musk's closing arguments are suddenly on the populist tack of pointing at OpenAI and saying, these billionaires are making gobs of cash while running a charity for the supposed good of the world. I'm curious if the jury can register this argument, even if it comes from Elon Musk, the world's richest man. Ouch. OpenAI counsel begins closing argument with a broadside against Musk: even the people who work for him, even the mother of his children, can't back his story. Oh, yeah. Back to the war of the Photoshops. Okay. Closing remarks now, and on the digital displays and the monitors for exhibits, all the OpenAI executives look like Olan Mills photo shoots. Do you know what Olan Mills is? He says it's complimentary. I need to get up to speed on my photographers.

Speaker 1: Olan Mills offers portrait photography.

Speaker 2: Oh. It does look very nice if you pull up the Google Images on Olan Mills. Anyway, short summary of the closing. Musk camp: all these OpenAI executives are rich as hell and lying all the time. OpenAI camp: all of that is a sideshow, and literally all the claims Musk is bringing cannot be stood up by actual law. The Microsoft camp disappears into the bushes. Dota got mentioned again. They love mentioning Defense of the Ancients. Incredible Photoshop from the OpenAI camp of a calendar of events, complete with little characters and a timeline of events. I wonder if they're using an image gen model or if they're doing it the old-fashioned way. I can't wait until it's entered into evidence this afternoon so he can show us. Sort of want to buy this meme guitar, but I also have two Telecasters. That's just a complete side note. Gamer has entered the chat. The Dota moment has been mentioned nearly every single day during this three-week trial. AI researchers, and we gotta have Mike back on the show, it's so good, saw it as a true breakthrough in the technology. So Mike Isaac says

Speaker 1: What is the timeline for the jury to meet? Okay. Is this something they're doing today?

Speaker 2: So they're getting a thirty-minute recess, the most they've had in a month. I might actually be able to go outside and get real food. There's a Popeyes across the street. Is it a bad idea to get a bucket of red beans and rice? That's what he's thinking about doing. So, not much news on when this will close. It is 01:10 Pacific time. I imagine that they will wrap up by, what did you say, 3PM? 4PM? The thirty-minute break happened forty minutes ago. So I imagine that

Speaker 1: But they've been taking Fridays off, is kind of what I'm getting at. Oh, yeah. Because this could happen.

Speaker 2: So maybe this happens Monday. This is just closing arguments. It's not necessarily the end of the trial.

Speaker 1: Or the jury might get the results. Or the jury might make a quick call,

Speaker 2: Well, there was an update eleven minutes ago. A lawyer for OpenAI on Thursday defended the company's chief executive, Sam Altman, from withering character attacks by Elon Musk's legal team, as both sides delivered their closing arguments in a trial with potentially seismic implications. The stakes are high. Mr. Musk, who was not in the courtroom on Thursday because he was in China with President Trump, is asking for more than $150 billion in damages. He is also asking the court to remove Mr. Altman from the startup's board and to stop a shift the company made last year to operate as a for-profit company. They pushed back. Sarah Eddy, a member of OpenAI's legal team, tried in her closing argument to dull the attacks on Altman's credibility and to argue that there was never a firm agreement among the founders that could have been breached. No one in this case other than Elon Musk has testified to any commitments or promises that Sam Altman or Greg Brockman or OpenAI made to Mr. Musk, is what she's saying. And there is a new update that just dropped in. After the recess, William Savitt, OpenAI's lead counsel, told the jury that Musk does not have a claim against the startup unless there was a specific agreement between Musk and OpenAI describing how his donations to the nonprofit should be spent. That agreement does not exist, Savitt said. So that's where I guess OpenAI is leaving it for now. We will continue to cover the story

Speaker 1: Doug says, is the jury allowed to use Codex slash Claude? He does, in one and a half hours.

Speaker 2: There are other tech problems going on. Max Zaff over at Wired has been covering the story as well and says, Musk's lawyer brought a big monitor, maybe 36 inches, into the courtroom. OpenAI's lawyers asked to use it. Musk's lawyer said no. The judge told Musk's lawyers that they have to let OpenAI use it. Then OpenAI said it might not be possible to connect their laptops to it. AGI is here, but we'll still need a dongle, I suppose. A dongle has entered the courtroom, Max says. There are about 15 lawyers standing in the middle of the room right now talking about how to use this big monitor. This is wild. They should have talked to OpenAI about sharing their monitor. What do I always do? I always tell you, when you come in here, talk to the other side. We don't have the technology available right now, so we don't want to use the TV. We think we should just get rid of it, says the OpenAI lawyer. Sam Altman just walked into the room, by the way. So that happened four hours ago. One of Musk's lawyers carried the big monitor out of the room upside down, wire dragging behind him. Defeated, he retreats. That is a very, very funny story.

Speaker 1: In other news

Speaker 2: Bring it down.

Speaker 1: Tim Draper says, I think I broke a record. I took 52 pitches in fifty two minutes at below 40 degrees. Welcome to my office. Hashtag Draper University. Hashtag survival training. What do we think about going in the ice tank?

Speaker 2: How cold are ice baths typically? You've done ice baths. I feel like I did one and it wasn't as insanely difficult as people said, but then I checked the temperature, and I don't think it was 40. I think

Speaker 1: it was closer to 50. You can totally get closer to I I put

Speaker 2: because there's a couple of companies that sell

Speaker 1: Personally, when when if you're Yeah. Going surfing and the water is below 45 degrees, it can just be very painful.

Speaker 2: Okay.

Speaker 1: Like to Yeah. So even in a wet suit.

Speaker 2: Oh, okay. Your fingers go down. Anywhere that's not covered.

Speaker 1: A lot of people are putting gloves on. What do

Speaker 2: you think, Tyler?

Speaker 7: I So apparently, Joe Rogan's at like 34.

Speaker 2: 34. Yeah. Wow.

Speaker 7: So that's like the cold plunge,

Speaker 12: you know.

Speaker 5: He's the

Speaker 11: he's the

Speaker 12: top of

Speaker 5: the mountain when it comes to ice baths.

Speaker 2: He's the final boss.

Speaker 1: This yeah. This is this is just a crazy picture. I did think it was I did think it was AI, but but it turns out it's real. It's just funny because it looks like like what is this set? What is this setup?

Speaker 2: Yeah. What are all the trash bags there? And the wall is like sort of decrepit and there's piping and

Speaker 1: It looks like kind of like a prison ice bath.

Speaker 2: Yeah. This is not what you'd expect. I mean, isn't he a billionaire investor? You'd expect some sort of palatial setup. You see the properties that Mark Zuckerberg is acquiring, that big investors are acquiring. You would expect something much more regal. But he's doing it the old-fashioned way: whipped this up himself, bought some trash bags, and took some pitches. Yeah. And, you know, who knows? Maybe the next founder of Cursor, Figma, or Ramp is

Speaker 1: 52 pitches in fifty-two minutes is crazy.

Speaker 2: A minute per pitch is crazy fast. I mean, we do ten-minute interviews, fifteen-minute interviews, and barely get to the meat of the

Speaker 1: And this one you got four Minutes. Four of the founders.

Speaker 2: Four founders jumping in one minute. That is remarkable. I am not

Speaker 1: no stranger to controversy, though. Yeah. Joe Wansell says, I am not a humble man, but this is legitimately beyond my capabilities.

Speaker 2: Absolutely wild. Well, Vercel's Guillermo Rauch, a friend of the show, is apparently running an ad campaign on Lyft by buying custom license plates and deploying them through Lyft drivers? Is that what's going on here? No. You think it's random?

Speaker 1: The guy Peter, the driver was like, I must love the Love

Speaker 2: Vercel or worked there or something.

Speaker 12: I don't know.

Speaker 1: If he was the eighth employee at Vercel, I don't think he'd be driving. Hopefully not. He just truly loves the game, loves driving.

Speaker 2: Or he's just super illiquid. He just like he's just like

Speaker 6: Hey, please.

Speaker 2: Pay me zero, actually. I'll drive Lyft. Please. I want all equity. I'm super bullish on

Speaker 1: Vercel. That's a possibility. That's

Speaker 2: That's a possibility. Well, Alex Konrad says, is your startup even sponsoring Lyft license plates yet? It's an outside-the-box strategy. Someone should pick it up. Someone should do it. Get a bunch of license plates for cars, rent them out to Lyft drivers, get those impressions. Wix is down a bunch. This seems like a very logical company to suffer in the age of vibe coding. People are vibe coding websites all the time, and Wix is a supplier service to build websites based on templates. But Wix was buying back 30% of its shares at $92 six weeks ago, and the stock is now down another 45%. And so I was wondering about this. I almost asked Max Levchin about this yesterday. It seemed like he was very confident about the SaaSpocalypse and did not feel the need to respond or take any dramatic actions, just sort of wait and let the metrics do the talking. But I was wondering: are you tempted, as a CEO, when your stock trades down on a narrative that you know does not apply to you, but you're just sort of collateral damage? Are you tempted to do a quick buyback and just sort of, you know, get a good deal on your stock, even if it's just, you know, three months down, then right back

Speaker 1: Imagine being a public company CEO and buying back your stock and then getting a return on it. It has to be one of the most euphoric experiences. Yeah.

Speaker 2: Yeah. Totally.

Speaker 1: Not not not actually getting return, but but obviously, you know, decreasing

Speaker 2: Yeah.

Speaker 1: Or increasing everyone's

Speaker 2: Well, Wix is a $2.9 billion company now.
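As a quick sanity check on the buyback numbers quoted a moment ago (a buyback at $92, with the stock since down another 45%), the implied current share price works out as follows; this is purely back-of-envelope arithmetic on the figures stated in the conversation:

```python
# Back-of-envelope check on the Wix figures quoted in the conversation:
# shares bought back at $92, stock since "down another 45%".
buyback_price = 92.0   # buyback price quoted above, in dollars per share
drawdown = 0.45        # subsequent decline quoted above

implied_price = buyback_price * (1 - drawdown)
print(round(implied_price, 2))  # 50.6
```

The result, roughly $50.60, is consistent with the low-$50s share price discussed later in the conversation.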

Speaker 1: Yeah. They acquired this company, Base44.

Speaker 2: Okay.

Speaker 1: Remember, this was like a one person

Speaker 2: Oh, yeah.

Speaker 1: That's right.

Speaker 6: One

Speaker 1: person company, and I think Base44 has been growing revenue quite quickly. It seems like with pretty much any of these vibe coding tools, the experience is so magical for people that a lot of them have grown revenue.

Speaker 2: Fascinating stock chart if you zoom all the way out. During COVID, 2021, the ZIRP era, the stock was at $300 a share. It's at $52 today, by the way. It traded down after the ZIRP era ended, all the way to $50, $60 a share. Then the post-ChatGPT moment, 2024, was fantastic for the stock. It got back up all the way to $250, $200 a share. But then since 2025, as AI has gotten better at coding, vibe coding websites, doing front-end design, there has been a significant sell-off that continues today. And so rough goes

Speaker 11: I was looking

Speaker 1: over there. To get a comp. Yeah. I looked up Squarespace. Squarespace is no longer publicly traded.

Speaker 2: Mhmm.

Speaker 1: It was traded on the NYSE, but it was delisted after being taken private at $7.2 billion. Mhmm. That is tough timing. Taken private on 10/17/2024. Oh, interesting. And at the time, yeah, there was not a SaaSpocalypse narrative. Yep. You couldn't one-shot a beautiful website. Yep. With a single prompt. Yep. It's gonna be so hard for that firm to make money on this deal.

Speaker 2: Yeah. It feels like a new-customer problem, just because it's not the hot new technology that you're hearing about. Like, the podcast ad conversion has to be a lot worse. But I would be very interested to know what retention is like. Because I know some people that have built these website generator companies, and they just keep growing and growing and sticking around forever, because once someone has the magical experience of building a little website for their company or their personal brand, they just let it run forever. They're like, $10 a month? I'll just let it keep going. Well... Yeah.

Speaker 1: So Squarespace had done around $1 billion of revenue in 2023. Okay. I'm assuming they grew into 2024. We don't have the full-year numbers because it was taken private in Q4. But that's a pretty reasonable revenue multiple. Yeah. But if they lose out on a lot of those new customers... Yeah. Because every single company in the world, it seems like, is trying to make a box that will make you a website.

Speaker 2: Yeah. Yeah. Everyone. Anyway, you know what very few companies are making? A nightstand that turns into a bat and a shield for defense. I like this. It looks so unassuming as a nightstand. Very believable. No one would guess, but then something happens: you grab your bat and shield and you're ready to rock. Did you pick one of these up? It has a little bit of a hotel vibe to it. And also, I like a nightstand that

Speaker 1: All I would say is don't bring a nightstand to a gunfight.

Speaker 2: Okay. Yeah. Well, people are having fun with AI-generated videos showing that, yes, in fact, if it's not bulletproof, it has some trouble. Ben Thompson says, if you disable Open at Login for the Gemini app launcher, which the Gemini app installs in the background without asking, the Gemini app launcher will immediately re-enable Open at Login. I will now, needless to say, delete the Gemini app and don't intend to install it ever again. And so this is very, very odd. So it automatically opens at login no matter what. He says, I'm actually struggling to remember a bigger middle finger to a user from an app, ever. It's bad enough to install a helper app, but to immediately undo the user's explicit setting change? Incredible. And Josh Woodward from Google chimed in and said, this is a bug, it will be fixed in the next release, aiming for right after Google I/O, more if you're interested. So that's good. They did receive the feedback. Well, we should talk about Nikita Bier. I was reminded of this because he screenshotted and posted the greatest growth hack of his career, for one of his projects. Was this a year ago? The Gas or Explode app? This was about a year and a half ago, pre joining X and working with Elon Musk over at X. He launched an app called Explode, and he had a very interesting growth hack where he incorporated the company as TapGet Inc. And so in the iPhone App Store listing, under the name of the app, Explode, it would say TapGet, and then right below it, it would say Get, because it's a free app. It doubles down on the call to action.

Speaker 1: He made the entity a call to action.

Speaker 6: It's

Speaker 2: genius. Little these little things really add up and you've seen them all over X and he's done a good job of creating reengaging areas. And I just feel like the UI of X has been improving significantly. I'm really I'm really enjoying the latest UI feature where if you're watching a video and you want to speed it up, can hold on the right side of the screen, which is fairly common in video apps these days. Doesn't work in the iOS native video player. I don't even know if it works on YouTube, maybe. But what's really cool is that if you press and hold it, you will temporarily be in two x speed mode. But now in x, if you press hold and you're in two x speed mode and you drag down, it fills a little circle and keeps you locked in that two x speed mode and it actually changes the speed of the video permanently until you change it back. And so that little delightful touch is something that I'm I'm seeing more and more of from the x team and I'm a big fan of. Well, without further ado, have Andrew Feldman from Cerebras in the waiting room.