Sunday Robotics raises $165M Series B at $1.15B valuation, shifts focus from demos to real home deployments

Mar 16, 2026 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Tony Zhao

Tony from Sunday Robotics is here. We had to delay. Let's start the Lambda lightning round. Play that cube. We were overdue for a lightning round. We went long on Friday, but we have a lightning round today. And we will start with Tony from Sunday Robotics. Welcome back to the show. Tony, how are you doing?

There he is.

Very good. It's awesome to be back.

Thank you so much for coming back. Congratulations. Extremely busy.

Extremely busy. Incredible progress. Take us through it. How are you framing this most recent era of Sunday Robotics?

Yeah, I think the biggest announcement, or commitment, that we made is that, hey, we're ending the era of demos.

Yeah. We're focusing on deployments now. That's amazing. And I think what's really behind it is that there are so many robotics projects that start as a demo and end as a demo.

And that was like unfortunate.

No, no, they start as a demo, they end as a YouTube video that goes viral.

Yeah. And I think that, given how much progress we've made through the beginning of this year, and all the accumulated infrastructure and systems, we feel like we can deploy to real homes this year.

Yeah.

And that's the premise of the whole beta program that we talked about. That's our sole focus, and we're just going to do it really, really well.

Okay. What types of tasks do you have line of sight to everyday consumers benefiting from, with, I'm assuming, some level of supervision, but enough autonomy that they can be valuable? Are you going to sell them initially, or are you just going to place them into homes? How are you thinking about it?

I want a robot to ride a horse and pull me in a chariot behind it.

Yes,

that's what I'm into.

Yes. But what are you working on?

Yeah, 100%. So on the beta program, yeah,

we are actually going to document it. We're going to be very transparent. It's going to be autonomous as well. And the reason is that we have so much data that the robot will be generalizable.

Okay. But at the same time, when it comes to the tasks we'll address, I think the fun part is that it will not be surprising. You look at all the things that you spend the most time on, the things you hate the most. We're talking about laundry, we're talking about dishes, we're talking about organization, cleaning, these types of things. We're just going to pick a focus, do it very well, and be able to provide value. Okay.

So that is how we think about it. There will not be some super surprising pick of tasks.

Yeah. What do you think about the opportunity in offices versus the home? Everyone's focused on the home, but I feel like an office is potentially a less chaotic environment. People are generally more

Yeah.

like not leaving, you know, a trail of clothing around or whatever. Maybe there are more straightforward tasks. Where do you see the divide?

Yeah, I think home to us is such a good long-term goal to drive the AGI moment for physical intelligence, right? Because it's so diverse: so many tasks, so many objects, things are moving.

But I think as we approach it, as we build up the capability of the robot, it starts to unlock other use cases. Maybe it's in offices, maybe it's in hotels, maybe it's somewhere else. We're actually very open-minded about that, and it's something we're going to think a lot about this year.

What's going on on the data side? It sounds like you said you have a lot of data now. Where is that coming from?

Yeah, so for folks who haven't read our website yet, we have this new way of doing data collection, which is building gloves that mirror the robot's hand. So instead of needing to deploy thousands and thousands of robots, we just need to make these gloves, and people can wear them and collect data in their own homes. This gives us really high-quality data, but also really high diversity and quantity of data.

So I think this year we're going to scale to a few thousand of these people collecting data for us every day,

and we're going to build a high-quality and diverse data set that will be powering the foundation model we're going to train.

Is there value in having a less transferable, less precise data set with higher volume, maybe recorded through a face camera, like Meta Ray-Bans? I've seen some examples, I think it was in the LA Times today, about people doing chores with basically a GoPro on their head. They're just recording what they're doing while they're doing chores. It feels like maybe that's not the perfect data, but if you can transfer that data over to the gloves, and they can transfer that to the robot, maybe you get extra data. What's your thought on the continuum of data quality?

Yeah, I think you're talking about egocentric cameras, right? Where people strap a camera here to record their movements.

I think if you look at the quality side, we're definitely compromising. For example, we do not have precise measurements of how people use their hands. We do not have force information, tactile information, these types of things. So that data alone will not bring us all the way through. But at the same time, egocentric data, and all the data we already have in the public domain, on the internet, is going to help the robots, right? Because you can learn more general physics, you can learn some intuition around how rooms are arranged, all that common sense. So I think the eventual recipe will be a combination of that public video data with our proprietary data sets. And the way we think about it is that we're going to use our data to bridge this bulk of knowledge we can extract from the internet into a deployable product that is actually useful. Kind of bridge the gap from demos to something real that's providing value.
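The recipe described here, combining broad public video with a smaller, high-fidelity proprietary data set, is often implemented as weighted mixture sampling during training. Below is a minimal sketch of that idea; the source names and weights are hypothetical illustrations, not Sunday Robotics' actual pipeline:

```python
import random

# Hypothetical per-source sampling weights for a co-training mixture:
# large but imprecise public video vs. smaller, high-fidelity glove data.
MIXTURE = {
    "public_egocentric_video": 0.6,  # breadth: general physics, scene layouts
    "glove_teleop_episodes": 0.4,    # precision: hand poses, force/tactile
}

def sample_source(rng: random.Random) -> str:
    """Pick which dataset the next training batch is drawn from."""
    r = rng.random()
    cum = 0.0
    for name, weight in MIXTURE.items():
        cum += weight
        if r < cum:
            return name
    return name  # guard against floating-point edge at r ~ 1.0

# Over many batches, draws land roughly in proportion to the weights.
rng = random.Random(0)
counts = {name: 0 for name in MIXTURE}
for _ in range(10_000):
    counts[sample_source(rng)] += 1
print(counts)
```

In practice the weights themselves are a tuning knob: too much public video and the policy never becomes precise enough to deploy; too little and it loses the common-sense breadth Tony describes.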

Yeah.

How do you incentivize people to wear the gloves?

We pay them.

No, I know. But are you paying them, like...

Good question.

Good answer. No, but I'm curious: are you giving somebody the gloves and saying, "Hey, I want you to do at least an hour of activity a day"?

Is it per task? What's the structure?

Yeah.

If they just if they just put them on and then watch Netflix, I can't imagine that's that's valuable.

100%. I think we need to set requirements on both the quantity of data and the quality of data, and everything else.

But I think it's actually a really good part-time job to have,

in that you can collect the data anywhere, like in your home, and you can do it anytime. You can do it super early in the morning, you can do it super late at night, in between your shifts, whatever it is. We're going to be really happy about that. And you don't even need to leave your home.

Yeah,

it's cool.

How are you feeling about simulated data?

We've talked about the sim-to-real gap before, and there's always this problem that there's not enough variation in some Unreal Engine environment that you build a kinematic model in. But it feels like with generative AI, you should be able to stochastically generate different variations and create better synthetic data. It feels like the LLM companies are doing very well with synthetic data generation in certain cases, the various rollouts. How are you feeling about it? Do you think it's going to be in the playbook this year, maybe for a few years and then not anymore, or is it something that will be valuable farther out and maintained from there? How are you thinking about synthetic data?

Definitely. I think we talk about like world models a lot these days, right?

Yeah.

And people ask, hey, can we generate synthetic data out of the world models? How good is that? I think there are two sides of it. One is that training a world model allows us to leverage even more compute and even more data, like all of the internet, which can bring us a lot of knowledge without collecting any additional data.

Mh.

But at the same time, I think it is neither going to bridge the deployment gap, which means getting from 95% to 99.99%. Mhm.

Nor is it going to bridge the last millimeter for a certain manipulation task, because the fidelity of the data is slightly worse.

Sure.

But what I do see is it being a layer that lifts everyone.

Yeah.

Everyone will become better. Oh, sure. Because they now pre-train on all this synthetic data.

Yeah.

That makes a lot of sense.
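As a back-of-the-envelope illustration of the deployment gap Tony mentions (an editorial sketch, not a figure from the conversation): per-step reliability compounds across a multi-step chore, so a policy that looks impressive in a demo at 95% per step still fails most long tasks, which is why the jump toward 99.99% matters.

```python
# Per-step success compounds multiplicatively over a sequential task.
def task_success(per_step: float, steps: int) -> float:
    """Probability of completing all steps if each succeeds independently."""
    return per_step ** steps

# A 20-subtask chore (e.g., loading a dishwasher item by item):
print(round(task_success(0.95, 20), 3))    # ~0.358: fails ~2 times in 3
print(round(task_success(0.9999, 20), 3))  # ~0.998: almost always succeeds
```

The independence assumption here is simplistic (real failures correlate, and recovery behaviors help), but it captures why demo-grade and deployment-grade reliability are qualitatively different regimes.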

Uh you raised any money lately?

Yeah, we actually did it mostly last December, but

Oh, yeah. We raised a $165 million Series B led by CO2.

What was the valuation you said?

1.15 billion.

Wow. Unicorn already. I love it.

Uh well, great set of investors. Congratulations. We love CO2 over here and uh they know what they're doing. So, good luck.

Send us a pair of gloves, too.

Yeah.

You don't even need to pay us. We'll just continue.

How else will you know how to adjust a podcast mic 75 times over the course of 3 hours? You can't solve that. You can't see.

Well, thank you.

I'm not afraid to be automated by Sunday.

I invite it.

Challenge accepted. Uh, have a great rest of your day. We'll talk to you soon, Tony.

Good to see you, Tony.

Goodbye. Cheers.

Let me tell you about Lambda. We are, of course, in the Lambda Lightning round.