Nvidia's Dion Harris previews GTC: $3–4T AI infrastructure buildout, GPU-accelerated Oracle databases, and AI factories everywhere

Oct 24, 2025 · Full transcript · This transcript is auto-generated and may contain errors.

Featuring Dion Harris

42 minutes. Wow. And I got to the gym at least 30 minutes before you. My drive is twice as long. This whole week I was slacking. But our next guest is from Nvidia. Dion Harris is here. Welcome to the show. How are you doing? You're looking incredibly sharp.

You look fantastic. Thanks for having me. I didn't want to make you guys look bad, so I even threw on the jacket. Oh, yeah. Thanks. So great to have you on, great to meet you. Would love to get a little bit of an introduction on yourself and where you sit in Nvidia.

It's not the smallest company in the world. It is, in fact, the largest company in the world. But how do you position yourself within the firm? What's top of mind for you today? Sure. Thanks for having me again.

So, Dion Harris, I'm the senior director of our HPC, AI, and cloud solutions group. Not a very imaginative marketing name, which means I run a lot of our HPC and scientific computing work with national labs.

Obviously, we're here in DC, so there's lots of engagement supporting a lot of our national labs, both here in the US and across the world.

And of course, AI and cloud speaks for itself: a lot of our hyperscale clients, as well as the cloud providers who are building and delivering on our platform. So it's engaging with a lot of those folks, partners, and customers, and internally, obviously, to bring those solutions to market. Yeah.

Do you have a headline number that you're thinking about broadly for AI infrastructure? It's the hottest topic these days. How do you think about forecasting AI infrastructure and where we'll be in a few years?

Well, you know, I think Jensen has been on record, and I think it's a pretty sound estimate. We expect it to be about three to four trillion dollars in AI infrastructure. Yeah, it's a big number. It's with a T.

And I think what's really driving that, and I've been listening to the show, is that there's a ton of demand for how and where AI can be put to use.

Of course, we're familiar with the chatbots that we use to speed up our email communications and to review and spell-check. Those are great use cases, but what's really exciting is when you see AI being used to discover new drugs.

And so there are new use cases happening in drug discovery all the time. I do a ton of work with national labs, and I'm seeing how it's being applied to climate and weather,

being able to replace a lot of our old numerics-based approaches, and using AI to go tackle really big problems that we all care about. Even if we don't see and use those applications every day, we're impacted by them, right?

And so what's really driving that number, like I said, is also looking at how AI is being brought to industries. You're going to hear a lot next week at GTC about how we're helping industry adopt AI.

And this is really important, not just in terms of the bottom line of US companies, but in making the US competitive on a global scale, right? Making sure that we can deploy, run, build, and manufacture at an efficiency that makes us globally competitive.

And so that's really going to be a lot of what we talk about at GTC next week. Do you have one of the easier jobs in tech, just because if I'm a data center operator, I have to answer questions about, well, am I using natural gas or clean energy?

And if I'm an inference model creator, I have to answer questions about whether the answers are truthful or fair, or whether I'm generating something valuable or generating slop. But Nvidia, I feel like, has the perfect business, because you're already optimizing for the most efficiency.

So no one can say, "Oh, well, you're not taking energy efficiency seriously." Like, that's the entire point. And no one can say you're unprofitable. So, like, are you feeling good right now? Are you feeling happy? I don't know. Yeah.

What's the energy like internally at Nvidia? I'm curious. It's got to be such a wild environment. It's great.

It's definitely incredibly crazy right now. But what I would say is that there's so much interest and excitement around not just AI but accelerated computing, which is a lot of what we do. We're an accelerated computing company at heart, and there's over a trillion dollars of infrastructure still moving from CPUs to GPUs. In fact, Oracle just last week announced that they're going to be accelerating their classic database. Most people don't think about databases, but they're backending every major application.

And so, when we think about Nvidia, obviously accelerated computing is how we really got started. And now that we're in this era of AI, we're really looking to help power every application across multiple industries.

But to answer your question, we wake up thinking about how we can make our platform better and more efficient, because that's really at the core of the value for our customers. But we're actually doing a ton of work beyond just our core platforms.

In fact, our solutions are shaping and redefining how the data center overall is built. And so a lot of the work that we're doing is building solutions, blueprints, and reference architectures that help the entire ecosystem get ready for what we're building, which delivers more efficiency at the end state.

So, again, I think our position in the marketplace is unique in that we see and understand all the new models that are being developed and coming to market. We obviously understand what our products are going to be able to do today and going forward.

And we're therefore giving lots of insight to all the mechanical, electrical, and plumbing infrastructure providers, the power generation folks, the grid providers. We're giving lots of feedback all the way up and down the supply chain. Yeah, that Oracle database-on-GPU thing feels massive to me.

It feels very significant, with immediately tangible performance benefits, I can imagine. I haven't seen the actual stats, but I imagine it's going to be more efficient or faster.

But how do you balance marketing stories like that, stories about the AI buildout, with stuff that's more futuristic, further out? There was a lot of debate on the timeline this week about StarCloud. Yeah.

How fast GPUs will be in space, and everyone's saying, look, we love this founder. We're having him on the show next week, actually. It's a really exciting project, but I think everyone kind of agrees that this is not going to happen tomorrow.

And the rendering they put out, it's more of a sci-fi movie. It shows, you know, whole truckloads of GPUs moving around in space. How do you think about balancing that from the Nvidia brand perspective?

Well, from our perspective, we recognize that, like I said, there's a lot of interest and investment.

Like we talked about, there's that three to four trillion dollars that's going to be invested in AI infrastructure, and of course that infrastructure is going to be deployed in lots of different areas. Obviously, as we've talked about, a lot of the concentration historically has been around highly populated, dense areas, but now we recognize AI factories can exist anywhere. They can exist anywhere there's cheap, clean power. And obviously space is someplace where there's lots of renewable energy, so I suspect that will certainly be a place where data centers will land eventually.

But right now, we're really focused on how we can help build the infrastructure that's right in front of us, and help deploy it in the most efficient, performant way possible.

And of course, along the way, make sure that we're adding value, because that's really what we're here to do, not just for ourselves but for the broader ecosystem. You would laugh. We were running the numbers on building a hearth, a wood-powered data center server, here at the studio.

We think it's surprisingly doable. Like, out of a normal fireplace, we believe you could potentially power eight H100s and fine-tune GPT-OSS however you want over the course of a day.

Basically, we want to be the first Nvidia customer focused on the wood-powered data center. Yes, carving out our own part of the market. It's very cozy to chop firewood and boil water as they did hundreds of years ago.
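For what it's worth, the hosts' bit can be sanity-checked with a quick back-of-envelope script. Every figure below is an assumption for illustration (the per-GPU draw is roughly the H100 SXM TDP; the steam-conversion efficiency and firewood energy density are rough guesses), not anything stated on the show.

```python
# Back-of-envelope check of the "wood-powered data center" bit.
# All figures are rough assumptions, not measurements.

H100_POWER_W = 700            # assumed per-GPU draw under load (~SXM TDP)
NUM_GPUS = 8
OVERHEAD = 1.3                # assumed extra for CPUs, networking, cooling
STEAM_EFFICIENCY = 0.10       # assumed heat-to-electricity efficiency of a
                              # small steam setup (generous)
WOOD_ENERGY_KWH_PER_KG = 4.0  # rough figure for seasoned firewood

# Electrical load the fireplace-powered generator would need to supply.
electric_load_kw = H100_POWER_W * NUM_GPUS * OVERHEAD / 1000

# Thermal input the fire must provide at the assumed conversion efficiency.
thermal_load_kw = electric_load_kw / STEAM_EFFICIENCY

# Firewood burned per 24-hour fine-tuning run.
wood_per_day_kg = thermal_load_kw * 24 / WOOD_ENERGY_KWH_PER_KG

print(f"electric load:        {electric_load_kw:.1f} kW")
print(f"thermal input needed: {thermal_load_kw:.1f} kW")
print(f"firewood per day:     {wood_per_day_kg:.0f} kg")
```

Under these assumptions the fire would need to burn on the order of a few hundred kilograms of wood per day, which suggests "surprisingly doable" depends heavily on how generous the steam-efficiency guess is.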

Well, I'm super excited for GTC next week. Yeah, this is great. Thank you so much for giving us a little preview, and have a great weekend. We'll talk to you soon. Great hanging. Appreciate it. Thanks a lot. Before we bring in our next guest, let me tell you about Public.com.

Investing for those that take it seriously. Multi-asset investing, industry-leading yields, and they're trusted by millions. Soon billions. I think we might actually have a one-minute break before our next guest comes in. There's no one in the Restream waiting room. Or is there? No, not yet. So, give us a post.

Jordi Hays: If someone is smart but has bad aesthetics, normies will not take them seriously. This is good. Aesthetics contain real information. This post is from DefenderOfBasic,

saying that this category of smart people can't read, and if they gain power, these blind spots will lead to their undoing. I completely botched this reading. This is a mess. Here's another one.

Alex Danco. Sundar Pichai said yesterday, or a couple days ago: new breakthrough quantum algorithm published in Nature today. Our Willow chip has achieved the first-ever verifiable quantum advantage. Willow ran the algorithm, which we've named Quantum Echoes,

13,000 times faster than the best classical algorithm on one of the world's fastest supercomputers. This new algorithm can explain interactions between atoms in a molecule using nuclear magnetic resonance, paving a path toward potential future uses in drug discovery and materials science.

The result is verifiable, meaning its outcome can be repeated by other quantum computers or confirmed by experiments. This breakthrough is a significant step toward the first real-world application of quantum computing, and we're excited to see where it leads. Again, see this positioning, right?

It's positioned as an internal science project. Of course, he's saying it's a step toward the first real-world application of quantum. Did you see what Martin Shkreli said? What did he say? Contrived results, still not faster, not advantaged. He's, like, so bearish. Alex Danco, though, says, I'm okay,

I'm sorry if this is a dumb question. Congrats. And I think, when I see this, and I see the algorithm, and I see how they're talking about it,

it really does feel like, yes, over the next decade there could be some really, really powerful niche use cases that are extremely... it's a good quantum computer, sir. I agree, I agree. But one more post before we... did you want to explain why quantum computers look like that?

Because that's what Alex Danco is asking, and there is an answer. It's because they need to be cooled. They need to be really cold. And so you have to hang it instead of attaching it to the floor. Each layer of the chandelier is colder than the one above, down to the last one, where the quantum computation is taking place.

And so you see this
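The "each layer is colder" point above can be made concrete. The values below are typical published figures for the stages of a dilution refrigerator, not specs for Willow or any particular machine; the stage names and temperatures are illustrative assumptions.

```python
# Why quantum computers look like chandeliers: each hanging plate is a
# colder stage of a dilution refrigerator. Approximate, typical values.
stages_kelvin = {
    "room temperature":        300.0,
    "first stage (~50 K)":      50.0,
    "second stage (~4 K)":       4.0,
    "still (~1 K)":              1.0,
    "cold plate (~100 mK)":      0.1,
    "mixing chamber (~10 mK)":   0.01,  # where the qubits live
}

# Each plate must be colder than the one above it.
temps = list(stages_kelvin.values())
assert all(upper > lower for upper, lower in zip(temps, temps[1:]))

for name, t in stages_kelvin.items():
    print(f"{name:26s} {t:8.2f} K")
```

The bottom plate sits around ten millikelvin, colder than deep space, which is why everything hangs in a vacuum can rather than bolting to the floor.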