Neurosurgeon grows living neurons on electrodes to make AI models faster and cheaper — raises $25M seed

Feb 12, 2026 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Alexander Ksendzovsky

Thanks. Love it.

Let me tell you about Cisco, critical infrastructure for the AI era. Unlock seamless real-time experiences and new value with Cisco. And without further ado, we have Dr. Alex Ksendzovsky from the Biological Computing Company finishing out our Lambda Lightning round.

What's going on?

How you doing, doctor?

What's going on?

It's good. Good to see you. Uh,

lot of whiteboards today.

Yes, I'll ask you to explain that shortly, but first give us an introduction on yourself, since it's your first time on the show.

Yeah, absolutely. So, uh, my background, I'm a neurosurgeon, a neuroscientist.

I've been growing neurons on electrodes for about 20 years.

Okay.

Uh, I spent the bulk of my career studying how neurons and brains process information.

Okay.

Uh, first in humans, by implanting electrodes, and then in the laboratory, by growing neurons, as I mentioned, on electrodes.

Yeah.

And Yeah. Now,

really quickly, when you say growing neurons on electrodes, does that look like a petri dish? Is that organic? Is that meatspace, or is this a digital representation of the neuron?

So, there's no neurons on this. Uh, but this is the dish that we grow them on.

Okay.

Um, you can see it here.

Yeah.

Uh, that little box in the middle has about 5,000 electrodes.

Wow.

Um, and yeah, and we grow them on there. They're alive. And we use them to process information as computers.

They're biological. These are cells that are all connected to create the network. Fantastic. What can you do with it? I feel like a lot of people are pretty happy with their NVIDIA GPUs. What do I need a biological computer for?

Yeah, absolutely. So, first of all, neurons and brains are far more energy efficient than silicon. It's well known.

Um, and our first application is to help solve this looming energy crisis that's happening.

Um, you know, second of all, neurons can change the way they're connected to each other. So, as you all know, current hardware is rigid. It doesn't change. Yeah.

So, we're all for compute.

Mhm. How much of this is a science project? How long will you be in R&D? What is your timeline? I mean, no, we're not afraid of science projects here. We talk to folks all the time that give us timelines like "we're going to be live in 2035," and we love that. But it feels like you're making some pretty advanced progress. So talk to us about how the business shapes up, when you see this going into production, what the milestones are.

Yeah. So, uh, I have a big sticky note here on my monitor: this is not science fiction. This is not research. You know, if I was going to get anything across, that would be the thing, right? We're doing this now. We're deploying it.

Yeah.

Um

so far, our products are twofold. One, we're using the biological network to create a software layer that plugs directly into ANNs, artificial neural networks, now, to make them better, faster, cheaper.

Okay.

Um so that's currently happening.

In addition, and you asked about what this thing is: this is what we call our algorithm discovery platform. So we're using real brain cells to identify what's coming after transformers. Yeah.

So the way we do that is we build state-of-the-art models in house.

Um, we parameterize them. We ultimately use the biological network to understand and define the neuroscience principles that can be plugged into these state-of-the-art models. And we do this in a loop.

Um, and in the end, these plugins improve them and make them better, faster, cheaper.

Do you have any reflections on the shortcomings of current AI models, why they differ? There's that sort of famous observation: a teenager can learn to drive a car in like a month, a couple hours in the seat. And yes, we're able to train Waymos, but from an energy and time perspective, on an apples-to-apples basis, you're effectively doing like a million hours of training, and you don't have to give a human a million hours of training to drive a car. So what is different about the current structure of AI systems versus the way humans actually learn?

Yeah, absolutely. So, that's a bit of a history lesson. Um, so, you know, going back to the 1950s, where we had the first perceptron, the McCulloch-Pitts neuron, that was actually based on how we understood, at least at the time, how neurons function.

Mhm.

Fast forward to the 1980s: backpropagation, which won the Nobel Prize a couple years ago.

Uh, that's when a divergence happened, as backpropagation is not a real neuroscience principle. And ultimately, since then, software and hardware became very unbiological, in the sense that, again, it's rigid in the way it processes information.

At the same time though interestingly around the 1980s neuroscience flourished

uh, we started to be able to grow brain cells in a dish. We started to understand what the neural signals coming from the brain actually mean. This is the basis for a lot of the brain-computer interface companies that are currently out there. And so, three years ago, when we started the company, we said, okay, well, you know, now we're going to take a 2026 approach and understanding of how neurons process information, and we're going to apply it towards novel AI systems.

How do you uh keep the neurons alive? What do you feed them?

Yeah. So, um, they live in a medium that's got proteins. Um, it's got sugar in it. It's actually relatively simple.

It's Peat. It's Ray Peat. It's on the Peat diet. It's got sugar. Jordy, you're going to approve of this. This is great.

Sugar superfood.

Yeah. Yeah. These techniques have been kind of perfected over the years. Again, this is one of the things that allows us to do this. We can keep them alive for over a year.

Yeah.

Um Yeah.

Sugar, protein... any olive oil, any fats? We got some Bryan Johnson snake oil right here. You could put that in there.

Definitely some olive oil. We'll try it out.

Uh, what, uh, yeah, coming up,

Yeah. Fast forward to the end of this year:

what have you done that, uh, allows you to feel comfortable taking, like, you know, Christmas off, or New Year's Eve, something like that?

Yeah, that's a great question. Um, so: customers, collaborations, design partnerships. You know, right now, um, the adapter product that we have plugs into video generation models. Um, and so, you know, we are open to working with any foundation model lab that's compute constrained, and so, um, we can make their models more efficient

which is every lab

that's right. I mean, you know, similarly, as I mentioned earlier, with our algorithm discovery platform, we're looking for partnerships with the hyperscalers. I mean, you know, they have research arms that are interested in what's coming after transformers, right? And so,

um, we'll take the holidays off, uh, you know, once we're working with them as well.

Mhm. Well, uh you raised some money. How much did you raise?

I'm very happy to say. So, we raised a $25 million seed.

Seed.

There we go. There we go. But what's after a mango seed? What's bigger than a mango seed? Because I think of a mango seed as, like, eight. This is like a watermelon.

Yeah.

Watermelon.

No, watermelon has small seeds. Watermelon: big fruit, small seeds.

Mango is the biggest seed. It's just the end of the road. After that, you just go giant AI seed round. Well, congratulations, and thank you so much for coming on the show and breaking it down for us.

Yeah, come back, uh, as you guys have breakthroughs and things like that. Come back on. You don't need to have fundraising news to come on and talk about biological computing.

You guys, you guys come to the lab.

Yeah, San Francisco, in Mission Bay. You guys are always welcome to come.

Amazing.

See some brains.