Ben Koska on SF Tensor: GPU orchestration for AI training across clouds, $41K revenue in 2 weeks post-launch
Dec 3, 2025 · Full transcript · This transcript is auto-generated and may contain errors.
Featuring Ben Koska
AI apps. Anyway, it's a fascinating chart. I'm sure we'll be digging into it more, reading the tea leaves. But up next we have Ben from SF Tensor. It's Vercel for GPUs. Welcome to the show. Thank you so much. Please introduce yourself and the company.
Great to have you.
Hi, yeah, thanks. I'm Ben. We're building the infrastructure layer for AI researchers, basically for training models, from small experiments all the way up to large-scale frontier training runs. We handle the infrastructure that lets you do all your training runs.
Okay. So there are a bunch of different layers: somebody that owns the ground, somebody that builds the data center, somebody that racks the GPUs, and then there are the neoclouds. Are you interfacing with multiple neoclouds? Are you a neocloud? How are you positioning yourself?
Yeah, we work with all sorts of neoclouds and hyperscalers, and we're building above all of them. Our customers should only be worrying about what they want to research or train, not how the underlying technology works. So we deal with finding GPU allocations across providers, and we also let you work with TPUs or AMD GPUs or any of that to train your models.
Okay. So this is specifically for research and training runs, and less focused on actually running inference on the product side.
Yeah, we focus exclusively on the training side. There are great companies, even from last batch, for example Luminal, who do great things for inference. We focus just on training because we think training is a problem that hasn't been solved by anyone, and there needs to be way more training happening.
What are your clients like, what's the shape of them? When people think training, they think OpenAI, Anthropic, Google DeepMind, right? But take me through the variety, the landscape of folks you talk to who are actually doing training runs. Who are these folks? You don't have to give exact names, but tell me the shape of their workloads, what problems they're trying to solve, the scale of their training runs. Take me on a little tour.
Yeah, so there's a huge variety. On the one hand, you obviously have the academic researchers, or people at home training small models, and then you have larger-scale academic research happening. But you also have startups that have raised maybe, call it, $10 million, including some companies from YC, who are training models for super-niche use cases. And then there are also companies that have raised hundreds of millions or up to a billion dollars; there are a bunch of labs in that range training their own models. You don't just have Anthropic.
For text-based models, LLMs, there's not an awful lot of competition going on there anymore; things have sort of converged at the top. But everything else, like drug discovery or protein folding, these are still problems that have not been solved by anyone.
Is it correct to say that SF Tensor is a bet that there will be millions of smaller models for specific use cases, or one day billions?
I wouldn't say billions, but definitely a lot more than there are today, especially in the modalities that haven't been explored yet. We're all focusing on text, and text is great for a lot of things, but I can't really use a text-based model for things like text to speech, for example, which is another type of model, or protein folding models. These things can't really be solved with text. We need models that are specialized in those pockets.
What about... I mean, we were talking to the CEO of AWS yesterday, and he was saying that AWS launched a product that is actually a checkpoint 80% of the way through a foundation model, and then a company can come in, add their own data to the pre-training, and do everything else with it. That felt like an interesting proposition: if you do want a text-based model that really knows your company's data at the core, in the pre-training, really knows it, not just dropped in the prompt, not just fine-tuned on it, but actually baked in. It feels like we're going to see a Cambrian explosion where every company wants its own model trained earlier, and they're going to want training workloads for that. Is that something you think you can play in? Are there already other companies working there? How do you think about that?
It's a very unexplored area so far. The idea is basically that the model is already 80% of the way there: it can form coherent sentences and has basic reasoning abilities, and then I add my own information. I think that's going to be very important in the future, because it lets me take a base model and not just do post-training but sort of continuous pre-training, continuing the pre-training. I think a lot of use cases are going to come out of that, and I think we can help there. We don't really care what you're training on the hardware; if it's AI training, we can help with it. So that's definitely something we're looking into.
You want to ask about progress?
Yeah. What kind of metrics were you sharing today during demo day?
Yeah. So the metric we're sharing is that we launched about two weeks ago and we've done $41,000 in usage-based revenue since then.
There we go. Love it. And how's the round going?
We closed the first day of fundraising.
First day of fundraising.
There we go.
There you go. I'm not going to dox anyone, but a friend of ours...
We got a text message about you.
We got a text message about you. A friend of ours just backed one company this batch, and he's known for backing great companies, and he just backed you. So I'm excited for you guys to announce the round soon. And come back on and do it on TBPN.
Thank you so much.
Awesome. Great to meet you.
We'll talk to you soon.
Cheers.
Have a good one.
Good to meet you.
Let me tell you about getbezel.com. Shop over 26,500 luxury watches.
Super intelligence for your wrist.
Fully authenticated in-house by Bezel's team of experts. Brad Gerstner on Trump accounts: POTUS was elected on a Main Street agenda to get the rest of America into the game, and that's exactly what this does. Bill Gurley showing him some respect. And we didn't cover it yesterday, but Michael Dell donated $6.25 billion to these Trump accounts, the accounts where children get them, they can't be touched, they're invested, and they compound over time.
$250 for a bunch of individuals.
Yes.
And there was some pushback. Some people were saying, "Well, if you compound at the S&P, even if you compounded 10% for 20 years, it's only a thousand bucks or a couple thousand bucks. It's not that much money. It's not life-changing." But that's just Dell's contribution. There are going to be other people and corporations contributing, plus the $1,000 from America.
Yeah. Yeah. And there's a whole bunch of other ways to add money to the account over time, at birthdays and Christmas and stuff, targeted donations.
And the most important thing is that it's a lockbox. Psychologically, it's a lockbox. So I still stand with the...