Normal Computing raises $50M from Samsung to build thermodynamic AI chips targeting the data center energy crisis
Mar 27, 2026 · Full transcript · This transcript is auto-generated and may contain errors.
Featuring Faris Sbahi
Speaker 1: Faris, how are you doing?
Speaker 2: Hey, guys. Thanks for having me on.
Speaker 1: Thanks so much for joining us. Good to see you. Please introduce yourself and the company.
Speaker 3: Hey. I'm Faris, founder and CEO of Normal Computing. So Normal is a company that builds AI software with the world's largest semiconductor companies. Right now, more than half of the top 10 semiconductor companies by revenue use Normal to design custom silicon. And we're also using that same capability to bring new kinds of chips and ASICs to market ourselves, mostly focused on energy. So hyper-efficient, ultra-efficient ASICs.
Speaker 1: Okay.
Speaker 2: Seems somewhat abnormal given your stage to have this level of traction, but continued.
Speaker 1: Yeah. Where did the name come from?
Speaker 3: It's a couple of things. So one is, you know, right now the ASICs are targeting different kinds of probabilistic machine learning models
Speaker 8: Sure.
Speaker 3: Like diffusion models. So Normal is kind of an ode to the normal distribution. But also, there's this broader view that this is really the way computing should be. Like, this is the more natural way to process and run, you know, AI inference and AI workloads in general.
Speaker 1: Yeah. Tell us about the chip landscape and where you see the most activity. There's a lot of news from Arm this week around CPUs. Obviously, everyone's familiar with the GPU. And then between the TPU and Cerebras and Groq, there's, like, a few other ASICs that are sort of popular, but it feels like there's a much longer tail that you're going after.
Speaker 3: For sure. I mean, obviously, we had GTC last week. And Jensen put forward what I think is a really nice equation, where he's saying that revenue equals tokens per watt times total available gigawatts. So what that means is there are two main problems that we should be focused on right now, from our perspective, in chips. One is increasing the amount of energy that's useful and usable by us and for us. The other is improving the energy efficiency of the silicon that we're using. And so there's a few trends. One of those, I would say, relates to the news that Arm put forward. They have their own chip, you know, real hardware for the first time in forty years. Historically, they've been an IP licensing business. That's part of this broader trend toward custom silicon and the data center moving to heterogeneous computing, which means that in the future, not that many years from now, rather than having a very small number of different kinds of chips running all of our workloads, we're gonna have hundreds, if not thousands, of different kinds of chips, each tailored to a workload and application in the data center.
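The identity Faris quotes can be made concrete with a quick back-of-the-envelope sketch. All numbers, the function name, and the pricing assumption below are illustrative placeholders, not figures from the episode:

```python
# Back-of-the-envelope sketch of the identity quoted from GTC:
#   revenue ∝ (tokens per unit energy) × (total available gigawatts)
# Every number here is an illustrative assumption, not a real figure.

def revenue_proxy(tokens_per_joule: float, gigawatts: float,
                  price_per_million_tokens: float) -> float:
    """Dollars per second of token revenue under a fixed power budget."""
    watts = gigawatts * 1e9                    # 1 GW = 1e9 W; 1 W = 1 J/s
    tokens_per_second = tokens_per_joule * watts
    return tokens_per_second * price_per_million_tokens / 1e6

# The two levers Faris names are interchangeable in this identity:
# doubling efficiency at fixed power matches doubling power at fixed efficiency.
base = revenue_proxy(tokens_per_joule=10, gigawatts=1.0, price_per_million_tokens=2.0)
more_efficient = revenue_proxy(tokens_per_joule=20, gigawatts=1.0, price_per_million_tokens=2.0)
more_power = revenue_proxy(tokens_per_joule=10, gigawatts=2.0, price_per_million_tokens=2.0)
assert more_efficient == more_power == 2 * base
```

The symmetry in the last assertion is why he frames the two problems (more usable energy, more efficient silicon) as equally important levers.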
Speaker 1: Are thermodynamic computing chips underrated?
Speaker 3: I think so. Yeah. What do you think? What's
Speaker 1: your perspective? I haven't heard that much about them. I don't know. And it feels like it's very early stage. I've seen some demos, but it's very unclear at the demo phase how much is being done on the actual chip, what the value is. And we went through this with Groq and Cerebras, where there was very early pushback around, oh, well, you'll never be able to run this type of model on that chip, or you'll need to network a thousand chips together to do anything valuable. And I've been surprised, and I think we've seen some narrative violations around Cerebras and Groq. I've used Codex 4.5 Spark on Cerebras, and I know it's good. I know it works. We had a founder come on that baked Llama onto a chip and made it available at chatjimmy.ai. And you can just feel the value. I'm wondering, should I think about thermodynamic computing just as another speed versus size or cost trade-off, or does it unlock a different, like, vector of optimization?
Speaker 3: It is a different vector. So right now, the two typical vectors that you have are speed and energy. I mean, there's three if you include area. Those are typically the trade-offs that you're making. But we're introducing a new one, which is noise.
Speaker 1: Okay.
Speaker 3: So for these kinds of devices, you can't run a calculator or something where you need full precision, like deterministic calculations. But let's say that you're trying to run something like a diffusion model. Right? Which I think is super topical, because OpenAI just wound down Sora this week. It was costing them like $15,000,000 a day to run, against $2,100,000 in cumulative revenue. But that's the kind of workload that's a really nice fit for this computing paradigm
Speaker 1: Yeah.
Speaker 3: Because you're taking something that's noisy and approximate by definition, and mapping it to hardware where the physics maps really nicely between those equations.
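The mapping Faris gestures at, where injected noise does useful work instead of being an error to suppress, can be illustrated with a toy Langevin sampler: it draws from a normal distribution precisely by adding Gaussian noise at every step, much like a diffusion model's reverse process. This is a generic textbook sketch under standard assumptions, not Normal Computing's hardware design:

```python
# Toy illustration of noise as a computational resource:
# overdamped Langevin dynamics samples a Gaussian by *adding* noise
# at every step. Generic sketch, not Normal Computing's actual design.
import random
import statistics

def langevin_gaussian(mu: float, sigma: float, steps: int = 20000,
                      dt: float = 0.01, seed: int = 0) -> list[float]:
    """Simulate dx = -(x - mu)/sigma^2 dt + sqrt(2 dt) * gaussian noise.

    The stationary distribution of this dynamics is N(mu, sigma^2).
    """
    rng = random.Random(seed)
    x, samples = mu, []
    for _ in range(steps):
        drift = -(x - mu) / sigma**2               # pull toward the mean
        x += drift * dt + (2 * dt) ** 0.5 * rng.gauss(0, 1)
        samples.append(x)
    return samples

xs = langevin_gaussian(mu=3.0, sigma=1.0)
# Long-run statistics approach the target normal distribution.
print(statistics.mean(xs), statistics.stdev(xs))
```

On a thermodynamic device the injected noise term would come for free from the physics of the circuit, which is the sense in which the equations "map nicely" onto hardware.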
Speaker 1: You think that's five years away? Ten years away?
Speaker 3: We think less. Yeah. We think it's less than that right now. And I think it has to be. So, like, from our perspective, 2030 is a key date that we talk about a lot internally, because there's a view that even by 2028 we're gonna have, like, a 49-gigawatt shortfall in terms of power. And so something has to change, something has to give. It could be energy, it could be that we find, like, new sources, or, you know, data centers in space, or one of these other directions. Or we have a real breakthrough when it comes to silicon. That's kind of what we're going after.
Speaker 1: That'd be amazing. Tell us about the round. I wanna hit the gong.
Speaker 3: Oh, awesome. Yeah. I have my own gong behind me too. I don't know if you saw that.
Speaker 2: No way. Just for the
Speaker 3: It's a $50,000,000 round. We're calling it the accelerator round, led by Samsung.
Speaker 1: Thank you.
Speaker 2: Hit the gong. Led by who? Sorry, missed it.
Speaker 4: Samsung.
Speaker 1: There you go. That's
Speaker 2: And what were you doing before this?
Speaker 3: I was at Google. I was at Google Brain and then Google X afterwards, where I met my co-founders.
Speaker 2: There you go. Somewhat of a nontraditional background. Yeah. Great idea.
Speaker 1: Like the perfect team for this. Well, congratulations. Very excited. And, yeah, come back on the show. We'll talk to you soon.
Speaker 2: Actually, multiple times this year.
Speaker 1: Yeah. Have a good one.
Speaker 2: Take care.
Speaker 1: Let me tell you about Okta. Okta helps you assign every AI agent a trusted identity so you get the power of AI without the risk. Secure every agent. And let me tell you about Cognition.