Brett Adcock launches Harc AI lab: 50+ hires, a datacenter of Blackwells, and the ex-Apple iPhone designer building next-gen AI hardware

Mar 30, 2026 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Brett Adcock

handle your customer support, go to Fin.ai.

I would love to sit in on that pitch meeting. "We're building an infinite money glitch."

It seemed like that.

$49 a month.

I mean, I don't know. There's a world, you know, where Teespring was empowering entrepreneurs to sell a lot of t-shirts. There are a lot of different things. Depends on what you bring to the platform, I suppose. Well, without further ado, we have Brett Adcock in the Restream Radio. Let's bring him into TV Ultra.

Brett, how are you doing?

Good, guys. Good to see you again.

Good to see you again. It's been far too long. But since the last time we talked, you launched a new company. So, break it down for us.

Oh, Harc.

Yes.

Yeah. Let me uh

I want to know about that.

Okay. Well, I mean, I guess the summary here is I've been working for the last three years on what I think is maybe one of the hardest AI problems in the world: getting AI to work on humanoid robots.

Yeah. Separately, I've basically been using and watching what's happening in the digital world, you know, the different language models, and to be honest, I think they're just incredibly dumb.

Like, they don't remember anything about me. It's not very personal.

They can't listen or talk to me really well. They can't see the world. They can't use computers well. I just think this whole experience

should feel very much like a sci-fi movie. This should feel like Jarvis, something that can really understand you, is very personalized, and uses tools well. So about seven months ago I started a new AI lab called Harc

and we want to build really advanced, personalized intelligence. In order to get there, we think there are some fundamental gaps remaining in the models, so we have a large focus on building new multimodal models. And the second thing is, you know, we're interacting with AI today through 20-year-old computers,

like my phone and laptop. These are all decades old.

Mhm.

And we feel very strongly that there's a next generation of AI devices that need to be built to interface with AGI appropriately. So we have a team here dedicated not only to models but also to design. One of our key guys, Abidur, started about four months ago. He previously led design for the MacBook, MacBook Air, and the iPhone 13, 15, 16, and 17, and was in the keynote for the iPhone 17 Air about five months ago. So Abi is here with a killer team on the hardware side, and we're designing next-generation interfaces for the models that we're working on here.

Is it the interface that you think is the issue, or do you need more compute locally?

I think it's twofold. I think there are some large gaps remaining on the model-development side that we want to try to close. And secondly, I think the interface, how we're using traditional computers right now to interface with this AI, is extremely broken.

We think both need to be fixed to have a really killer, superintelligent personal assistant. We think you need to fix the hardware interface, and we also think we need to fix the model side. I mean, there are simple things today that need to be better. Computer-use agents today are just not very good; they're getting better every month, but there's still a large gap to close. Speech-to-speech systems, which will be a really natural UI into AGI, are just not great.

They don't remember things I've told them. They don't have access to my life. They can't access my calendar. They're pretty high latency, and the EQ and naturalness are not great. So we're taking a holistic approach to this problem and saying we have to work on the models and we have to fix the interface issue here today.

What is the hiring market right now for all this talent? Because you're basically going up against

Apple, OpenAI, Claude; you've got Demis. You've already bitten off a lot, obviously, with Figure, and we can move over and get the update there. But I'm just so curious: when you're recruiting talent for Harc, I have to imagine any of the people that you're hiring, if you want to hire them, probably have the opportunity to work at these other companies. So, what's working on that side?

I mean, I think the summary is like all the other companies are kind of boring. They're all doing the same thing.

Um, they're all copying each other. We've headed in a certain direction the last three years, and I think that direction is somewhat saturating. Working on vision understanding, working on models that can go interact with the world and get that interaction data — we think these areas are especially important for pushing the boundaries and getting this AGI feeling of highly multimodal scenarios. So from a hiring perspective, we're being extremely competitive. We've brought on over 50 people onto the team now, about two-thirds of that on the AI side from top frontier labs. I will say it's probably one of the most competitive areas I've ever hired for in terms of compensation. The space is just completely lit up. I've never seen anything like it before. You know, I've hired people across all areas of robotics and AI and

software and hardware. This is just next-level competitive. I think there's a very small number of people in the world who really understand how to build the right infra, the pre-training data mix; all of this is just very tough. And some of these spaces are just new, like computer-use agents that can really reason well in pixel space; this is just happening now. So there's not a lot of good precedent for how to go out and build these systems. But

why not — what was the decision-making process around doing this externally? Because I feel like a lot of the capabilities that you want to build with Harc, I'm assuming you'll want to integrate into Figure, if Figure is going to be a robot that can add value to my day-to-day life. Where's the overlap, and why build it externally?

I'm a big fan of focus. At Figure we have a singular focus, which is: how do we solve for a general-purpose humanoid robot? How do we build a human in a bodysuit that has common-sense reasoning? A lot of the AI focus we have at Figure is basically how we predict physics around things, like grabbing and touching and moving through the world. At Harc, we have a different objective. We want to launch next-generation consumer electronics, and we want to build extremely multimodal models that can almost act as a Jarvis-type interface to AI,

and the focus on those tracks is completely different. With that said, though, I think there's some opportunity over time to closely collaborate. The voice model on the robot today is using the Harc voice API.

So if you talk to any of our robots here today, it's using the Harc voice model that we designed here internally. So I think there's a lot of room over time for the businesses to collaborate. We're taking an entire data center of B200s in April here; Figure literally has half the building and Harc has the other half. We're obviously paying for things separately, but between the two of us we literally have an entire data center of next-generation Blackwells that we're using for training AI models.

I want your latest timelines on the ChatGPT moment for humanoid robotics. We were talking to Shaun Maguire about this. He was putting it at maybe two to three years away, three to four years, from just, you know, seeing them on the streets, seeing them in restaurants, seeing them in the real world. Maybe not economic impact, because that could happen in all sorts of different industrial settings, but it will be a special moment, I think, when people wind up interacting with a humanoid robot. What are you thinking these days?

You can come to Figure right now and you can see robots running complete 24/7 shifts.

Yeah.

Fully autonomously with neural nets all the way down the stack.

That's amazing.

So I think this will be a big year for us, shipping robots commercially to many different customers of ours.

Yeah.

And then we're also working on how we integrate these into the home.

Yeah.

How do we get robots that go and do laundry and dishes and tidy the house? Things that I just don't want to be doing, that nobody really wants to do, and use robotics as a key tool for this. I think we're having this moment now. We're in it. We're feeling it right now; we're seeing these robots do long-horizon autonomous work at Figure, and I think over this year and next year the whole world is going to wake up to it. I think we saw a little bit of that at the White House last week with Figure. We saw unprecedented demand, and

it was kind of crazy, because we didn't show any new capabilities. Internally we were like, okay, we're just going to be at the White House. And it was a big milestone: the first humanoid robot built in the US to be at the White House in history. So getting the invite from the White House to come there and be the first one to do it was just huge. It was great, but there were no new capabilities there. The whole world is just waking up to the moment of humanoids now, and it was very apparent from last week, from the incredible reaction we got from basically the entire world, that we're still early here in the cycle.

I love it. Well, good luck and thank you for taking the time to come chat with us.

One quick question: can humanoids reliably crack open a cold Diet Coke and serve it? Because I imagine the tab is kind of a challenge. That's AGI for me.

I don't think they should have any problem doing this.

We can uh

That's a demo we'd love to see, because honestly, John would buy

a humanoid just to come over with a cold one and crack it open. He goes through a lot of these, so we could get some real utility out of it.