Marc Boroditsky on Nebius closing a $17B+ deal with Microsoft for AI cloud infrastructure

Sep 11, 2025 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Marc Boroditsky

day before. The chief revenue officer, Marc. He's the guy that did the deal, who certainly would have played a part in this new deal with Microsoft. We're very excited to bring in Marc from the Restream waiting room. Let's bring him in. Marc, how are you doing? Hey Johnny, fantastic to be here today.

Thanks so much for having you on. You look like you're in a pretty good mood. Any reason? Any good news happen recently? Oh man, it's been an exciting week, and I think you already called it out in the wind-up to my showing up. It's extraordinary what's going on at Nebius today.

Give us the high level, a couple bullet points on the deal this week. Well, we announced a landmark partnership with Microsoft, one of the leading AI labs that is out there, to support them with AI infrastructure under a 5-year agreement valued at at least $17.4 billion, with the potential to reach $19.4 billion.

It's fantastic. Did you think you could get a deal like that done when you joined? I mean, I have to imagine that you joined roughly six months ago, was it? I joined a little over three months ago. So maybe the deal was in the works, but it sounds like... Wait, yeah.

Did they see Satya Nadella on the Dwarkesh Patel podcast say, "I'm happy to be a leaser," and you said, "Oh, we'll lease to you"? Was that exactly how it played out, or, I imagine, probably a longer conversation?

I think it's a little more subtle than that. But he really did hold up his hand and say, "Hey, come pitch me," and people did, and you got the deal done. Congrats. It's extraordinary to have them as a partner.

I mean, Microsoft is by itself an amazing company, and for us to be winning the honor is a landmark for the company, and it's helping us to accelerate all of our plans. Fantastic. Talk about the collection behind you. The chat's calling it out. Crazy Technic. Those are Legos, man.

That's a Ferrari, a Lamborghini, and a McLaren. The holy trinity, almost. That's the trinity over there. There's another trinity over there of the Land Rover and the G-Wagon. I love it. I love making the Lego cars. Those are fantastic. Give us the state of the company.

We've covered Nebius on the show before. We know kind of the backstory, but yeah, talk about kind of the more recent history. And, you know, I'm sure you got up to speed as you joined and first learned about the company.

Well, the company is laser-focused on building the AI infrastructure of choice for startups all the way through enterprises, and we are making big strides against that vision. We continue to support the most innovative startups.

As a matter of fact, one of our customers was on your podcast today, Higgsfield, which is an amazing company. Amazing company, and we're very privileged to have them as a customer. Likewise, all the way up to major enterprises like Cloudflare and Shopify. Wow.

And in all cases, we're helping them not only to service the near-term model creation or early development work that they have, but also to scale their applications and deliver the inference that they need. So, it's a very exciting time. That's fantastic. Yeah, going forward.

You set the bar pretty high here with this recent deal with Microsoft.

What's kind of the focus over the next... I guess I'm curious to understand, with the news around Oracle and OpenAI this week, a lot of people were just kind of questioning, like, okay, you're going to build something that's AWS scale in a very short amount of time here.

That seems pretty challenging. How much of the effort at Nebius is really just around scaling infrastructure, knowing that the demand is there, but you have to build that capacity?

Well, we've always had the vision, and Arkady, our CEO, has said it for a long time, that we would do some of these super lab deals. Mhm.

In today's market, it happens to be a fantastic opportunity, but we believe that in the fullness of time the real opportunity is more along the lines of what you described, which is delivering the AWS equivalent for AI, and that's where the priority is focused right now.

So, continuing to build out our software stack, being able to cater to that, you know, single engineer that's trying to build something from scratch, who signs up with our self-service experience, and then being there with them as they scale and their needs expand.

At the same time, we're pursuing all of the other high-scale AI-consuming customers out there.

And, you know, we think that the next several years are going to be a crazy time, as capacity supply is being chased by a lot of different people. But ultimately, the long-term, hundred-billion-dollar-plus business opportunity is servicing the equivalent of the hyperscaler requirements for the new class of AI customer.

Friend of the show Dylan Patel over at SemiAnalysis gave you guys a gold medal in the ClusterMAX ranking of GPU clouds. Give us one layer deeper in terms of how you frame what Nebius does best, uniquely, among the competitive landscape.

You know, SemiAnalysis called you gold, but I'm sure there are a lot more details of how you stand out. Well, we're delivering the software tier that's giving the AI engineer the tools to operate a small or large cluster, or multiple clusters, as they see fit, and we're seeing more and more of that.

We give the AI engineer the tooling to create a model and optimize a model, and we put at their fingertips all the open-source models. We give the AI engineer the ability to conduct their training and retraining, and through our AI Studio solutions we also give them the ability to deliver inference.

So you can think of it as a full-spectrum set of AI capabilities for the AI engineer. We're going to continue to innovate and add to those capabilities.

So you can think about all the adjacent spaces that the AI engineer is likely to need to tap, from data pipelines to repositories to who knows what comes next. But our intention is to fully cater to that AI engineer and be able to serve them whether they're in a small company or a large organization. Yeah.

How does the future of the actual inference API business look? Is that more of a focus for growth, versus going more of the construction-and-real-estate route and building mega projects, versus creating, like, a fantastic API? Do you want to do both?

What do the next couple of years look like? I mean, the simple answer is we're going to do both. Okay. And the reality is, at the foundation, we're chasing demand. Yeah. So today there is a fair amount of construction taking place.

So being there for all of these new model development and retraining requirements is critical. As those projects and customers move into, you know, real production, inference is picking up, and we're seeing it. The redistribution as people succeed is very real.

And in the fullness of time, what we should all want, and this is thinking about it from the standpoint of end customers or investors, is to see all of these projects turn inference-driven, because that means that they're actually in production, being utilized by, hopefully, commercially paying customers somewhere.

Makes sense. Real value. Well, congratulations on the deal. Thanks so much for... Yeah, great to catch up, Marc. Congrats. Pleasure to be here today. Thank you both. Thank you so much. Cheers. We'll talk to you soon. Have a great rest of your day. We got to get more Legos around the studio. That was, uh, I...