Brad Lightcap on GPT-5's cross-industry impact, productivity gains, and OpenAI's enterprise adoption strategy
Aug 7, 2025 · Full transcript · This transcript is auto-generated and may contain errors.
Featuring Brad Lightcap
products, but I'm sure those are edge cases to date. But where do you feel like it's going, based on your user base?
Yeah, I mean, it's going to go one direction or the other. Product managers are either going to develop the hard skills to do the design, the go-to-market, and the engineering job to some extent, because some of those other jobs are definitely going away for product managers. Or, my favorite use case, engineers and designers are going to get tools like ChatGPT or these prototyping tools or Cursor, and they're going to be able to actually do the product management job. So what I think is we're going to see a new type of role emerge, a much more generalist role, where people maybe have a specialist capability and they're augmenting that product thinking or that technical thinking with AI. But I don't think there are going to be product managers as they were five or 10 years ago for much longer.
Makes sense. Well, thank you so much for stopping by. Great chatting with you.
Thanks for having me.
We'll talk soon. Bye.
Cheers.
Up next, we have Brad Lightcap, the chief operating officer of OpenAI. Welcome to the stream, Brad. Also, Jordy, your post saying, "I'm updating my timelines. You now have four years to escape the permanent underclass" has over 4,000 likes.
There we go.
Absolutely banging.
A thousand likes for every year. Love to see it.
Love it. Anyway, Brad, how are you doing?
What's going on, guys? How are you?
Good.
Congratulations on the launch. What are the biggest takeaways for today from your side? I'd love to know what it actually means to be the COO of OpenAI. OpenAI does so many different things: consumer internet company, API business, enterprise, all sorts of stuff, building data centers. What is your actual role?
My role is kind of whatever the company needs me to do. I play everything from PM when I need to, to salesperson when I need to. That's kind of the fun part of the job for me. On this launch in particular, it was really fun. I spent a lot of time in the last few weeks with customers and partners, getting a feel for GPT-5 relative to what they were previously using. In some cases, those were OpenAI models; in some cases, they were other models. But you know, I've been at OpenAI a long time, seven years, so I've seen GPT-3, I've seen GPT-4, and then to be able to see GPT-5, and the joy of people being able to use it in production and seeing how much better it is, that's the best part.
Greg told us earlier about the era of having to pay people to use the early versions of the product. You guys have come a long way since then.
Yeah, we had like three customers with GPT-3 or something like that. So it was easy to manage, easy to talk to all of them. They were actually tired of us calling them asking, "Is it good? Is it getting better?" So now we're fortunate that we've got more than that. But it's cool. I mean, the diversity of use cases, the number of things that people are able to use it for. We've got everything from the team at Amgen, big pharma and life sciences, using it for clinical workflows, teams at Uber building it for customer support, teams at Notion and Cursor building it into products that people use every day. I think that's the power of it: it just more and more covers the surface area of things people do with these tools.

I'm not sure how much you touch organizational design at OpenAI, but I'd be interested to hear your thoughts on how those companies that you mentioned should be thinking about AI changing their org structure. Is it sort of a horizontal, cross-functional service layer, like a finance team that touches a lot of different elements of the business? Or should most companies be thinking about standing up a dedicated AI implementation team, as in "how do we get a chatbot on every product that we already ship?" How do you think about those trade-offs if you were talking to a friend at a Fortune 500 company that was thinking about their AI strategy?
Yeah, you know, it's an interesting question. I think it was maybe said earlier on the show: the thing we see is just that people can do more. There's this much wider latitude that you get as an individual person at an individual company, where, especially as you get bigger, in more bureaucratic organizations with a lot of different functions and levels, you have to rely on a lot of other people in the org to get stuff done. You've got to rely on your data science team to do data analysis, your design team to do mockups, your marketing team to do copy. And I think what we see with AI is it just accelerates people to get to a great V1 of everything. Mhm.
So if you're a high-agency individual and you want to get stuff done, you're no longer gated on people that you otherwise would be. I think that should enable organizations to move a lot faster, and it should enable the people at organizations who really drive them to do a lot more. We see that consistently with ChatGPT Enterprise. That is consistently what we hear, and we seek those people out when we deploy ChatGPT Enterprise: we find those two or three people at the organization who are the AI superstars and champions, and then try to actually use them as touch points for the rest of the org to learn from.
How are you personally using AI these days?
You know, my biggest challenge day-to-day is context switching. If you look at my calendar from top to bottom, I joke with my wife that I have to show up to work wearing a lab coat, then I take the lab coat off and put on some sunglasses and a film school jacket, and then I'm talking to a media company, and then I take that off. So I go through the costume changes. What I actually mostly use it for is to help bridge me from thing to thing, to put me in the mindset of being able to work with customers and help customers. GPT-5 is incredibly good at this kind of structured reasoning: how do we actually take this very diverse set of things that models like GPT-5 can do and apply them in domains that I don't think about every day? It gives me this launching-off point to be able to talk with leaders and customers much more fluently about how we can help their organizations.
Within, let's say, a set of companies like the Fortune 500, what does AI adoption look like across the spectrum? Because I'm sure there are companies that you talk to that are truly adopting AI in the way that John was mentioning, trying to become AI native, changing their entire organizational approach. And then there are companies that just want to buy software so they can say they're becoming AI native. So what does that spectrum look like in practice?
Yeah, it is a wide spectrum. At the top level, we're seeing just an amazing appetite for wanting to adopt tools for people, and I think that's the easiest place to start. Typically that's where we steer organizations if they're starting at zero: just give your people the best tools. You may have seen that we've grown ChatGPT for Work, which is our Enterprise and Team product, from 3 million seats to 5 million seats from June till now. So, torrid growth there, and we don't see any abatement in demand. If anything, it's accelerated from last year, and so I think people and organizations are starting to realize that, at a minimum, you need to make sure people have the best tools. What's cool about GPT-5 now is it also enables people to use the best tools at every point. If you're in an organization, you're not fumbling with the model picker. You're not trying to figure out when to use a reasoning model. You're not trying to figure out the art of prompting to get the perfect thing. All of that stuff is abstracted and taken care of for you, and you can have confidence that your people are actually using the best models at any given point. Beneath that, it gets a little more complicated. More and more organizations, I think, are starting to grasp how the tools can actually help in the business process. Whether that's in customer support, research, or software engineering and data science, you're seeing these tools more and more adopted in the enterprise. I think there's still a quality gap, though. We are now just breaking into what I would call the era of models that have capabilities good enough to make a dent in the types of problems businesses care about. Businesses care a lot about things like reliability. They care about accuracy.
They care about the resiliency of the model to recover from tool-use errors and to be able to string together these very long, multi-tool, multi-step workflows. GPT-5 is a step up on all those things, and I expect that will enable us to do more and more things in the business process.

Do you think those customers you just mentioned will stick with this idea that GPT-4-level workloads will stay on GPT-4, and maybe there'll be cost savings, but those workloads will stick around for a very long time, while you develop almost new capabilities, new workflows, new workloads that are additive? Or is everything so fresh that enterprises will want to just rewrite everything with the latest and greatest?
More often than not, I think it's the latter. You want to rewrite everything. One of the cool things we did here was we were able to keep the pricing on GPT-5 at the level of o3 pricing. So if you're cost sensitive, you don't really have an excuse not to upgrade. GPT-5 is faster than o3 and 4.1, so we've improved on latency for use cases that are speed sensitive and latency sensitive. And obviously the intelligence bar has gone up. So unless you've got a really narrow and specific workflow where a model like 4.1 is okay, there's really not a reason, I think, that people wouldn't upgrade.
Yeah. Do we need a three-dimensional Pareto frontier right now that matches not just cost and capability, but cost, capability, and latency or something? Is that something you're seeing a lot of demand for in the enterprise?
Yeah, 100%. We actually measure it that way. We look at those three vectors, and it's always kind of an optimization function along those three axes.
We think we found that here. In terms of where my work was over the last few weeks, a lot of it was a qualitative, really manual process of collecting feedback, because everyone's got a little bit of a different preference, and we can only pick one or two points on that curve. So dialing in customer feedback, namely developer feedback, on where that balance of things is was a big part of our process for picking all those points. We hope that people like it and that it unlocks the kind of maximal use.
That's great. Jordy?

How are you thinking about open source? Who's been most excited to get access to it, and where do you see it going?
Yeah, I mean, it's important to us. I'm glad we've gotten this out; it's been a huge team effort. I think there was kind of a narrative that OpenAI doesn't like open source anymore. It's like, no, we're just really busy with a gazillion other things. So hopefully going forward we've got more of a leaned-in vantage point on open source. But it unlocks a huge number of use cases. If you think about government use cases, on-prem use cases where you're handling sensitive data in very sensitive environments, or places where you want to run models on the edge, all these things right now are kind of inaccessible to us as a service provider to customers, because we just don't quite have models that fit at those points. So for us we think this is huge TAM expansion, and we're excited to be able to work with enterprises on implementing that model, which I think is hopefully competitive with our o3 class of models.
What is the landscape like for companies that are helping to implement OpenAI products at various enterprises? You have the big consulting groups that will give you an AI strategy, and maybe they'll try to take it a step further, but I imagine there's a cottage industry of firms that have sprung up to try to help organizations unlock the value beyond "hey, let's just get everybody a ChatGPT for Work seat."

Yeah, I think there will be this new industry that emerges that is separate and apart from the legacy set of SIs and consultants, one that is really AI fluent, very AI native. I think it's very hard to borrow paradigms from the last 20 years of software building and implementation that are going to map to what we're dealing with here. You're dealing with fundamentally probabilistic systems that are moving and improving at a rate that is now collapsing to every few months. The nature of use cases changes quickly, and where enterprises are focused on deploying them changes quickly, so I think it's just hard for the legacy industries to keep up, frankly. We've had a lot of success working with some of this new breed of SIs, the Distyls of the world and others, that really have been forged in the fire, so to speak, of this new platform. So we hope there's more of them. We'd be excited to work with anyone that wants to work with us on it. There's more business than we can handle, so we're always happy to spread the love.
Talk about the $1 ChatGPT product for the government. Were you involved in that at all?
I was involved in that. We wanted to do something that was meaningful for the US government. It's been a real big focus of ours lately. Our view is the government has got to start to modernize. We've got to make sure that the tools that we use in the private sector are also in the hands of the folks serving us in the public sector, and we wanted to make that really simple. So we made ChatGPT, basically the equivalent of ChatGPT Enterprise, free: it's a dollar per year per agency. Hopefully they can afford that. We wanted to make that available to anyone that wanted to use it, standardized through the GSA. So we're super appreciative of the partnership with them, and there's more, I think, that we can do on that front.
How is that different than, if I'm a government employee, just going to the website, where I have access to that and to Google right now? I mean, Scott Kupor was saying that he can't use it. So why? Talk to me about how it's different to offer ChatGPT as an actual service with a contract that you're vending, where they are actually a client, versus just putting up a website that every government employee can access to some degree. Or would it be blocked? Why does it need to be a deal at all, as opposed to everyone just using it?
Yeah. So part of it is just making sure that government employees can access it. In some places, obviously, there are blockers in place that prevent access.
We hear a lot of stories, by the way, of people going out on their lunch break to their car in the parking lot, pulling up ChatGPT on their phone, and throwing a bunch of stuff in there, just because they know it'll get them through the day faster.
And we've done work, by the way, with governments, with the state of Pennsylvania and other places, where we've seen dramatic increases, things like two to three hours a day saved per employee, given the nature of the work that they do and how helpful ChatGPT can be. So this lets us have an interface into them as a customer. It lets our team engage with them in a direct way. We can see how they're using the product and can help them use it better. That's important for us: we've got to build on that foundation with them.
And then presumably it also allows the government to define security and privacy in their world, as opposed to you just being some website out there, where their only choice is block or don't block, rather than actually communicating with you: this is okay to train on, this is not, keep everything private, et cetera.
Yeah, I mean, we don't train on enterprise data at all, so you're safe there. But for us, it's about being able to treat them as a customer, to treat them as a user. You mentioned earlier, when we were talking about there being these points of success at every organization, that you've got people who are way more sophisticated at using these tools than others. We want to be able to see those people and amplify them, and the government's no different. There are people we've worked with in government who are incredibly sophisticated in how they use AI tools, and our goal is to get everyone there.
How do you think about the group of users that are active students? They've been on summer break, and you guys have been busy over the summer. You recently launched, I forget the exact name of the product, I think it was like ChatGPT learning. How are you thinking about that cohort and unlocking new capabilities for them this coming year?

Yeah. So we launched something called study mode, which is in our core ChatGPT product, and it was a little bit of an experiment. We wanted to see if changing the way the model behaves when it knows you want to be in a learning mode can actually enhance outcomes for students. We have all these studies that have been done, very anecdotally, about ChatGPT's ability to drive student outcomes and learning outcomes. So here we took a little bit more of an intentional approach: if you take the model and actually use it in a more Socratic style, it can quiz you, it can withhold certain information that it wants you to be able to empirically deduce, it wants you to reason about problems, and it kind of reasons with you as a partner. So far, so good. It's really cool. Learning is kind of the killer use case of ChatGPT, and so to be able to launch something that in some sense extends that killer use case has been really cool, and the student feedback so far, even on summer break, has been positive.
Well, we'll let you get back to your day. What's next on your agenda? Are you putting on the lab coat or the suit and tie and going to Washington?
Good question. You know, today I'm mostly with the team and talking to customers, and maybe tomorrow I'll get back to the lab coat. But in the meantime,
I appreciate you taking the time to talk today.
Yeah. Well, thank you so much for taking