Parag Agrawal's Parallel raises from Sequoia as AI agents drive explosive web infrastructure demand
Apr 29, 2026 · Full transcript · This transcript is auto-generated and may contain errors.
Featuring Andrew Reed & Parag Agrawal
Speaker 1: was the estimate. And for Amazon, $181.5 billion against a $177 billion estimate. So beats across the board, but various levels of expectations. We'll be digging into the CapEx numbers and all those other downstream metrics that we discussed at the top of the show. But we have our next guest in the waiting room. We have Parag Agrawal, and I believe we have Andrew Reed as well. Welcome to the show. Hey, guys. How are you guys doing? What's going on? Thank you so much for taking the time. Let's kick it off with what you got done this week.
Speaker 7: We got Andrew Reed on the board. Let's go. Go.
Speaker 2: Very, very good.
Speaker 1: Yeah. How did the deal come together? What's the thesis? What's the update? What's the progress like? Sort of reintroduce the shape of the company.
Speaker 7: John, Jordy, great to be here again. I don't know. Last time I was here, we were talking about this abstract notion of AI agents using the web, and wanting web infrastructure so that agents can use the web. Right? And it's been six months, and the agents have shown up.
Speaker 1: 100%.
Speaker 7: And these agents are using the web. And we've built a bunch of technology and productized it in the last six months. We only launched the product in August. And we're seeing agents use the web to do useful things for people, and useful things for businesses. And really, I don't know how many agents you use, but this year, the breadth of customers, the quality of customers, the variety of use cases, they've all exploded well beyond our wildest imaginations. And we should start with, you know, the
Speaker 2: Well, let's just go around and say how many Mac minis we all have. I personally have 20 running. Oh, yeah. You do? John, I know, has... I have a single one, a single Mac mini,
Speaker 1: barely doing anything. Continue. No. I'm interested to know about, well, two main things, but let's start with the first. In terms of robots.txt, how much of the Internet can be seen by agents? Is this a wall of some sort? We've seen different reports about different organizations with content on the Internet having different policies. And are customers coming to you saying, I want to be a good actor, and so I don't want to violate any of the robots.txt agreements, even though they are not necessarily legally binding in every scenario, as I understand it? But is seeing the entire web, being able to do everything across the web, a unique value proposition in this day and age?
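For listeners following along: the robots.txt files discussed here are machine-readable, and Python's standard library ships a parser for them. Below is a minimal sketch of how an agent could check a URL against a site's policy before fetching; the policy text, agent name, and URLs are hypothetical examples, not Parallel's actual crawler configuration.

```python
# Minimal sketch: checking a hypothetical robots.txt policy before
# an agent fetches a page. The policy, user-agent name, and URLs
# are illustrative, not any real crawler's configuration.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())  # in production: rp.set_url(...) then rp.read()

# can_fetch(user_agent, url) matches the URL path against the rules
print(rp.can_fetch("ExampleAgent", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("ExampleAgent", "https://example.com/blog/post.html"))       # True
```

As the question notes, honoring these rules is a convention rather than a legal requirement; the parser only tells the agent what the publisher has asked for.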
Speaker 7: It will be. Right? If you think about it, there's a lot more content on the open web. I think more recently, as people have started worrying about agents sort of stealing their content, there's certainly a fear, and perhaps a trend in some pockets, to put content behind vaults, whether it be via robots.txt or some sort of a blocker. Like, that is a legitimate fear, and we see some of it happen. We actually started the company to, in part, solve that problem. The common thread from my work at Twitter through this company was to incentivize more content to be out in the open for everyone to see and use. So we've also been working with people, content owners, publishers who publish on the web, to effectively align their incentives for the world of AIs. So when we talk about rebuilding the web for AI, it's not just about building technology to serve people building AI solutions. It is also to empower content owners and publishers to actually have sustainable, real, awesome, fast growing businesses that share in the value that they help create with AIs. And we'll have more to share perhaps, and we'll be back to talk about it more.
Speaker 2: He's vague posting on the show. He's fishing for another invite.
Speaker 1: Yeah. How are you thinking about the various modalities with which to interact with Internet resources? Like, we have APIs. We have web front ends. Agents can use both. There's MCP servers. There's potential for an MCP standard that's, you know, metered with stablecoins or even just credit card payments. There are solutions for every different endpoint. Some of them require more RL and training. Do you see any of them as a point of differentiation, where you want to be, like, the best at setting up an API integration on the fly, seamlessly, so you can get the action done? Or do you want to just be able to deal with any front end, no matter how janky it is? Maybe they haven't updated their website since 1995. It still works. Maybe it's using some crazy front end framework. It still works, because you're really good at agentically interacting with the UI.
Speaker 7: Yeah. A few different things here. Right? Number one, you do have to take websites and content published in every era and allow agents to access them. So a bunch of the heavy lifting we do is to enable that to happen today. Though none of those are the most efficient ways of doing things. Right? And ultimately, the world wants to move towards more efficient solutions, because that allows you to do more and do better. And so we built a bunch of things where content owners can provide data more efficiently. People saw llms.txt get around a little bit. A very small part of the web uses it, but we translate the entire web to make it easier for agents to use it. Now, in terms of APIs and MCPs, and whether you pay for an API with a traditional credit card in an account or use a crypto protocol, whether it be x402 or Tempo, all of these things make sense. They make sense for different customers, for different agents, and different contexts. And really, the way we think about solving the problem is we're building essentially an infrastructure layer. If you look at our product suite, it exposes various layers of our technology stack. We're building this vertically integrated technology stack, from first principles: crawling the web, indexing the web, ranking the web, building agents on top of it, building proactive agents on top of it. And exposing all of these things as APIs and MCP servers and as CLIs, because different customer needs and different customer problems and different customer circumstances demand slightly different solutions. It's like AWS exposing a VM, versus a managed service to run compute, versus an even higher level abstraction to run serverless. And it's the same thing that shows up in this infra environment of agents using the web.
Speaker 2: What companies would be crazy to not sign up for the service today?
Speaker 7: If you're using an LLM and are not trying to get it to forget everything that's out on the web, you should totally be using it. In the end, it's not complicated. Everyone. And it's not complicated because, if you think of, like, any products that you use which are powered by LLMs or agents, you have this mental model now that they are both smart and all knowing. And if you don't give them the best access to all of the live, fresh content on the web and all of the long tail content on the web, like, that illusion of all knowing goes away. And so you kind of are forced... like, initially, ChatGPT, you were all in love with it. It did not have web. Now, try to turn the web off for a day in whatever agentic product you use. Coding agents, we all used to use them without it. They used to not crawl the web. I don't know if you remember, but I used to, like, paste in links for it. I used to, like, export PDFs and then upload the PDF from the website when you wanted to add extra context into the context window. And now, you just assume that it knows everything. It's crazy. And that was a year ago. And today,
Speaker 1: you're gonna feel like a caveman doing that. Okay. We have to talk to Andrew, because the chat is calling you a silent partner. They're saying, nothing from my end. Thanks. I wanna know, did the progress here with this company, did you expect it, or did it hit you like a flashbang? Full-on flashbang. Talk me through your journey. Well, I'll let you take that one too.
Speaker 3: Okay. No. No. No. Allow me to insert myself. Please. Since the last time I was on TBPN, when you guys started going down the, like, deep AI infrastructure rabbit hole, I saw how the show changed, and I'm unprepared. So my anxiety level increased quite a bit. You're good. I've been in some conversations with Parag with other people who are, like, crazy deep on web infra. However deep John can go, I think Parag can go even deeper. So I feel good about that. Good. Yeah. First off, I'm delighted to be on the show with my newest investment, and it's an incredibly exciting company. Honestly, I think, like many of our most interesting investments, you start seeing a bunch of portfolio companies adopting the same product, and also in these high growth areas. You see, you know, background agents and long horizon agents out in the wild to a certain degree. But then you have the board meetings where you see all the prototypes and the products people are planning on launching the rest of this year. Sure. And I think the idea of agents going from something that runs locally and has very limited access, to something that runs in the background all the time and has basically a full workspace. And then you think about all the realms of knowledge work and how they overlap with the web, in terms of things changing in the world to trigger actions, to the actions that agents have to take in the world. And then people are viewing Parallel as kind of core to what they wanna do. And, you know, obviously, the growth rate's crazy, so it's not that complicated.
Speaker 2: We like it. That always helps. Icing on the cake. Yeah.
Speaker 1: Is this a period of uncertainty in venture broadly, around how certain markets will break, where powers and moats will emerge? Or do you think the smoke is clear? Do you think that it's actually pretty clear right now? Well, you have Sequoia. You have the crystal ball. I'll consult with them if there is some fog of war, you know? Yeah. I mean, there is a lens that's like, it's up only right now. Like, it's just such a big market that even the companies that aren't the hottest one day... You look at, I mean, just how you're processing the market right now. You look at CodeGen, and, like, a year ago, if you just backed the top, like, eight companies, they all did well. You'd have done really well. Yeah.
Speaker 3: I think, for what it's worth, you know, there is one sure thing, which is the data center build out. It's underway, and, you know, what that's gonna mean for token pricing and what's gonna be possible, and, obviously, you know, the model improvements, etcetera. It's interesting because, you know, I was watching Vlad before this, and I remember, interestingly, there's something similar with, you know, when I first met Parag. It was in Palo Alto. You know, an Internet celebrity building something that looked like one company from the surface, and then the further you got into it, you realized they're just doing so much more than meets the eye, you know. And I remember with Robinhood, you know, according to the Internet, it was, you know, a commission free stock trading app, but they had just built this internal self clearing system, the first one built in the US in thirty years, I think, since Vanguard. And their willingness to tackle long term, very hard engineering problems to serve their needs and their customers' needs for a long period of time was extremely unique in that industry. And it reminds me of Parallel. Like, if you go spend time in the Palo Alto office and you see the stuff they're working on, these are hardcore infrastructure engineering problems that are designed to serve the scalability, reliability, speed, and cost needs of customers a year or two from now, and, like, the world just doesn't quite see it yet. And then also, like, I remember when Parag first mentioned, you know, agents are gonna use the web a thousand times more than humans, and, you know, you kind of roll your eyes, because, like, that just seems ridiculous. But then if you actually look at the trends and just draw a line forward and imagine, like, what cloud agents are gonna be and how they're gonna overlap with knowledge work, it actually is probably, like, an undershooting of the number. Totally.
Speaker 2: Yeah. A thousand is personally my bear case. I'm expecting,
Speaker 1: like, million. We're seeing, like, uptime problems with major pieces of web infrastructure because the demand's so high. It's really showing up everywhere. Chip shortage, CPU shortage, so many different places. And yeah, I mean we we we saw early glimpses of it with you know certain just the number of queries a single deep research report would make or something like that. So