The a16z Show

Capital, Compute, and the Fight for AI Dominance

58 min
Feb 19, 2026
Summary

A16z partners Martin Casado and Sarah Wang discuss the unprecedented capital flows, talent wars, and structural changes in AI investing, exploring whether frontier model companies can consume the entire application layer through superior fundraising capabilities. They examine the blurred lines between venture/growth investing and infrastructure/application companies, while addressing concerns about circular funding and the sustainability of current AI economics.

Insights
  • AI companies can now raise capital and deploy it into compute to achieve breakthroughs within a year with small teams, creating an unprecedented capital flywheel unlike traditional software development
  • Frontier model companies may be able to raise more capital than the aggregate of all companies built on top of them, potentially allowing them to consume the entire application layer
  • The current AI boom differs from the dot-com era because there are 'no dark GPUs' - every dollar invested in compute has immediate demand, unlike the fiber overbuilding of the internet era
  • Traditional 'boring' enterprise software companies are underinvested as VCs chase AI growth stories, despite offering solid returns in large markets
  • The talent war magnitude is unprecedented, with individual contributors receiving offers in the tens of millions, fundamentally changing startup economics
Trends
  • Blurring lines between venture and growth stage investing due to large early-stage AI rounds
  • Model companies vertically integrating into applications while maintaining API businesses
  • Circular funding patterns between strategic investors and the AI companies they provide compute to
  • Consolidation of AI talent into fewer, better-funded companies
  • Shift from engineering-constrained to capital-constrained competitive dynamics
  • Geographic re-concentration of AI startups in the Bay Area
  • Custom ASIC development becoming economically viable for billion-dollar training runs
  • Founder movement and talent poaching at unprecedented scales
  • Enterprise software automation through AI coding assistants
  • Specialization vs. generalization debate in AI model development
Companies
OpenAI
Discussed as leading frontier model company with ChatGPT and custom silicon partnerships
Anthropic
Highlighted for Claude models and enterprise focus, competing with OpenAI
Character.AI
Example of founder movement when team was acquired by Google for IP licensing
Cursor
Success story of AI coding assistant that built custom models for developers
World Labs
Fei-Fei Li's 3D scene generation company where Martin Casado contributes code
Thinking Machines
A16z portfolio company discussed regarding recent founder changes and future prospects
Meta
Led the 2025 talent acquisition spree, paying billions to poach AI researchers
Google
Acquired Character.AI team and competes for AI talent with other tech giants
Nvidia
Referenced for GPU supply and custom ASIC economics comparisons
Broadcom
Mentioned as OpenAI's custom silicon partner for ASIC development
ElevenLabs
Example of specialized AI company maintaining market leadership in audio
Scale AI
Cited as horizontal solution from the autonomous vehicle wave
GitHub
Referenced as example of large developer tools market that A16z previously invested in
Zipline
Mentioned as example of hardware company in A16z's portfolio
Applied Intuition
Example of horizontal technology solution from autonomous vehicle era
People
Martin Casado
A16z general partner and creator of software-defined networking, codes for World Labs
Sarah Wang
A16z general partner focused on AI growth investments and model companies
Alessio Fanelli
Latent Space podcast host and Kernel Labs founder interviewing the A16z partners
Sean Wang
Latent Space podcast editor conducting the interview with A16z partners
Noam Shazeer
Character.AI founder who moved to Google, example of talent movement in AI
Fei-Fei Li
World Labs founder focused on spatial intelligence, where Martin Casado contributes
Ilya Sutskever
Safe Superintelligence founder, example of exceptional AI research talent
Mark Zuckerberg
Led Meta's aggressive AI talent acquisition strategy in 2025
Elon Musk
Referenced for driving robotics investment through humanoid robot development
Michael Truell
Cursor founder praised for focus on building rather than responding to noise
Quotes
"Very rarely can you see someone get poached for $5 billion. That's hard to compete with."
Sarah Wang
"There's no dark GPUs. Every dollar going into compute has demand on the other side."
Martin Casado
"A model company can raise capital, drop a model in a year with a team of 20 and produce something with immediate demand."
Martin Casado
"If Frontier Labs can raise three times more than the aggregate of every company built on top of them, they may consume the entire application layer."
Martin Casado
"I've never seen the perception of the truth be further from the truth industry wide ever."
Martin Casado
Full Transcript
5 Speakers
Speaker A

I mean, every industry has talent wars, but not at this magnitude. Very rarely can you see someone get poached for $5 billion. That's hard to compete with.

0:00

Speaker B

It's almost become a meme, right? Which is like if you're not basically growing from 0 to 100 in a year, you're not interesting. Which is the silliest thing to say.

0:07

Speaker A

When there's a real capability breakthrough, the demand is there and so the revenue growth is much faster than we've ever seen. Once it's turned on, there could be

0:14

Speaker B

a systemic situation where the SOTA models can raise so much money that they can outpay anybody that builds on top of them. Which would be something I don't think we've ever seen before. Just because we were so bottlenecked on engineering.

0:23

Speaker C

During the Internet build out, investors put money into fiber that nobody used. Four years of supply overhang followed. This time there are no dark GPUs. Every dollar going into compute has demand on the other side. But something else is different. A model company can raise capital, drop a model in a year with a team of 20, and produce something with immediate demand. If frontier labs can raise three times more than the aggregate of every company built on top of them, they may consume the entire application layer. Or the market fragments and value accrues to the companies closest to the end user. Nobody knows which path wins. In this conversation, previously aired on the Latent Space podcast, Martin Casado and Sarah Wang, general partners at a16z, speak with Alessio Fanelli and Sean Wang about the capital flywheel, talent wars, why boring software is underinvested, and whether every task is AGI-complete.

0:36

Speaker D

Hey everyone. Welcome to the Latent Space podcast, live from a16z. This is Alessio, founder of Kernel Labs, and I'm joined by swyx, editor of Latent Space.

1:28

Speaker B

Hey, hey, hey.

1:36

Speaker E

And we're so glad to be on with you guys. Also, a top AI podcast. Martin Casado and Sarah Wang, welcome.

1:36

Speaker B

Very happy to be here and welcome.

1:44

Speaker E

Yes, we love this office. We love what you've done with the place. The new logo is everywhere now. It still takes a while to get used to, but it reminds me of a sort of callback to a more ambitious age, which I think is kind

1:45

Speaker B

of definitely makes a statement. Yeah, yeah. Not quite sure what that statement is, but it makes a statement.

1:59

Speaker E

Martin, I go back with you to Nicira, and you know, you created software-defined networking and all that stuff. People can read up on your background. Sarah, I'm newer to you. You two sort of started working together on AI infrastructure stuff.

2:05

Speaker A

That's right, yeah, seven, seven years ago now.

2:19

Speaker B

Best growth investor in the entire industry.

2:21

Speaker E

Oh, say more.

2:23

Speaker B

Hands down, Sarah. I mean, when it comes to AI companies, Sarah I think has done the most kind of aggressive thesis around AI models.

2:24

Speaker D

Right.

2:35

Speaker B

So she worked with Noam Shazeer, Mira, Ilya, Fei-Fei, and so just these frontier kind of like large AI models. I think, you know, Sarah's been the broadest investor.

2:35

Speaker A

No, well, I was going to say I think it's a really interesting tag team, actually, just because a lot of these big seed deals, not only are they raising a lot of money, it's still a tech founder bet, which obviously is inherently early stage. But the resources: one, they just grow really quickly, but then two, the resources that they need day one are kind of growth scale. So the hybrid tag team that we have is quite effective, I think.

2:48

Speaker E

What is growth these days? You know, you don't wake up if it's less than a billion or like

3:14

Speaker B

It's actually a very interesting time in investing, because, you know, take the Character round, right? These tend to be pre-monetization, but the dollars are large enough that you need to have a larger fund. And the analysis, you know, because you've got lots of users, because this stuff has such high demand, requires more numbers sophistication. And so most of these deals, whether it's us or other firms, on these large model companies, are this hybrid between venture and growth.

3:20

Speaker A

Yeah, totally. And I think you know, stuff like BD for example, you wouldn't usually need BD when you were seed stage trying

3:45

Speaker E

to get biz dev.

3:52

Speaker A

Biz dev, exactly. But like now I'm not familiar, what

3:53

Speaker E

does biz dev mean for a venture fund? Because I know what biz dev means

3:55

Speaker A

for a. Yeah, you know, so a good example is, I mean, we talk about buying compute, but there's a huge negotiation involved there in terms of, okay, do you get equity for the compute? What sort of partner are you looking at? Is there a go-to-market arm to that? And these are things on this scale, hundreds of millions, you know, maybe six months into the inception of a company. You just wouldn't have had to negotiate these deals before.

3:58

Speaker B

Yeah, these large rounds are very complex now. In the past, if you did a Series A or a Series B, you're writing a 20 to 60 million dollar check and you call it a day. Now you normally have financial investors and strategic investors, and then the strategic portion always still goes with these kind of large compute contracts, which can take months to do. And so it's a very different time. I've been doing this for 10 years, and I've never seen anything like this.

4:22

Speaker E

Yeah. Do you have worries about the circular funding from some of these strategics?

4:47

Speaker B

Listen, as long as the demand is there, like the demand is there. The problem with the Internet is the demand wasn't there.

4:52

Speaker E

Exactly. All right. This is like the whole pyramid scheme bubble thing where it's obviously mark to market on the notional value of these deals. Fine. But once it starts to chip away and really.

4:57

Speaker B

Well, no, as long as there's demand. I mean, listen, a lot of these sound bites have already become kind of cliches, but they're worth saying, right? During the Internet days, we were raising money to put fiber in the ground that wasn't used. That's a problem, right? Because then you actually have a supply overhang. And even in the time of the Internet, the supply and bandwidth overhang, as massive as it was, as massive as the crash was, only lasted about four years. But we don't have a supply overhang. There's no dark GPUs. Right. And so, circular or not, if someone invests in a company, you know, they'll actually use the GPUs. And on the other side of it is the actual customer. So I think it's a different time.

5:08

Speaker A

I think the other piece, maybe just to add onto this, and I'm going to quote Martin in front of him, but this is probably also a unique time in that for the first time you can actually trace dollars to outcomes. Right. Provided that scaling laws are holding and capabilities are actually moving forward. Because if you can translate dollars into a capability improvement, there's demand there, to Martin's point. But if that somehow breaks, you know, obviously that's an important assumption in this whole thing to make it work. But, you know, instead of investing dollars into sales and marketing, you're investing into R&D to get to the capability increase. And that's sort of been the demand driver, because once there's an unlock there, people are willing to pay for it.

5:53

Speaker D

Yeah. Is there any difference in how you build the portfolio now that some of your growth companies are like the infrastructure of the early stage companies like, you know, OpenAI is now at the same size as some of the cloud providers were early on, what does that look like? How much information can you feed off each other between the two?

6:33

Speaker B

There's so many lines that are being crossed right now, or blurred. So we already talked about venture and growth. Another one that's being blurred is between infrastructure and apps. So what is a model company? It's clearly infrastructure, because it's doing kind of core R&D. It's a horizontal platform, but it's also an app because it touches the users directly. And then of course, the growth of these is just so high. And so I actually think you're just starting to see a new financing strategy emerge, and we've had to adapt as a result of that. And so there's been a lot of changes. You're right that these companies become platform companies very quickly. You've got ecosystem built out. So none of this is necessarily new, but the timescales in which it's happened is pretty phenomenal. And where we normally cut lines before is blurred a little bit. But that said, I mean, a lot of it also just does feel like things that we've seen in the past, like the cloud build out and the Internet build out as well.

6:52

Speaker A

Yeah, yeah. I think it's interesting. I don't know if you guys would agree with this, but it feels like the emerging strategy is, and this builds off of your other question: you raise money for compute, you pour the money into compute, you get some sort of breakthrough, you funnel the breakthrough into your vertically integrated application. That could be ChatGPT, that could be Claude Code, you know, whatever it is. You massively gain share and get users, maybe you're even subsidizing at that point, depending on your strategy. You raise money at the peak momentum, and then you rinse and repeat. And that wasn't true even two years ago, I think. So, tying it to fundraising strategy, right, there's a hiring strategy; all of these are tied. I think the lines are blurring even more today. But of course these companies all have API businesses, and so there are these frenemy lines that are getting blurred, in that, I mean, they have billions of dollars of API revenue. Right. And so there are customers there, but they're competing on the app layer.

7:52

Speaker B

Yeah. So this is a really, really important point. So I would say for sure, venture and growth, that line is blurry. App and infrastructure, that line is blurry. But I don't think that changes our practice so much. Where the very open questions are is: does this layer in the same way compute traditionally has? Like during the cloud, whatever, somebody wins one layer, but then another whole set of companies wins another layer. But that might not be the case here. It may be the case that you actually can't verticalize on the token stream. Like you can't build an app, like it necessarily goes down, just because there are no abstractions. So those are kind of the bigger existential questions we ask. Another thing that is very different this time than in the history of computer science is, in the past, if you raised money, then you basically had to wait for engineering to catch up, which famously doesn't scale, like The Mythical Man-Month. It took a very long time. But that's not the case here. A model company can raise money and drop a model in a year, and it's better. Right. And it does it with a team of 20 people or 10 people. So this type of money entering a company and then producing something that has demand and growth right away, and using that to raise more money, is a very different capital flywheel than we've ever seen before. And I think everybody's trying to understand what the consequences are. So I think it's less about big companies and growth and this, and more about these more systemic questions that we actually don't have answers to.

8:52

Speaker D

Yeah. At Kernel Labs, one of our ideas is: what if you had unlimited money to spend productively to turn tokens into products? The whole early stage market is very different, because today you're investing X amount of capital to win a deal because of price structure and whatnot, and you're kind of pot-committing to a certain strategy for a certain amount of time. But if you could iteratively spin out companies and products, and just say, I want to spend a million dollars of inference today and get a product out tomorrow. We should get to the point where the friction of token to product is so low that you can do this, and then you can change the early stage venture model to be much more iterative. And then every round is either 100K of inference or 100 million from a16z. There's no million-dollar seed round anymore.

10:17

Speaker B

But there's an industry structural question that we don't know the answer to, which involves the frontier models. Let's take Anthropic. Let's say Anthropic has a state-of-the-art model that has some large percentage of market share. And let's say that a company is building smaller models that use the bigger model, Opus 4.5, in the background, but they add value on top of that. Now, if Anthropic can raise three times more every subsequent round, they probably can raise more money than the entire app ecosystem that's built on top of it. And if that's the case, they can expand beyond everything built on top of it. Imagine a star that's just kind of expanding. So there could be a systemic situation where the SOTA models can raise so much money that they can outpay anybody that builds on top of them, which would be something I don't think we've ever seen before, just because we were so bottlenecked on engineering. And it's a very open question.

11:06

Speaker E

Yeah, it's almost like bitter lesson applied to the startup industry.

12:09

Speaker B

100%. Yeah. It literally becomes an issue of, like, raise capital, turn that directly into growth, use that to raise three times more. And if you can keep doing that, you literally can outspend any company that's built. Not any company. You can outspend the aggregate of companies on top of you, and therefore you'll necessarily take their share, which is crazy.

12:12

Speaker E

Would you say that kind of happened at Character? Is that the sort of postmortem on what happened?

12:30

Speaker D

No.

12:38

Speaker A

Yeah, because I think.

12:40

Speaker E

I mean, the actual postmortem is he wanted to go back to Google, but,

12:41

Speaker B

like, that's another difference.

12:45

Speaker A

You said it.

12:47

Speaker B

We should actually talk about this.

12:49

Speaker E

Go for it.

12:50

Speaker B

Take it over.

12:51

Speaker A

I was gonna say, I think the Character thing raises actually a different issue, which the frontier labs will face as well. So we'll see how they handle it. We invested in Character in January 2023, which feels like eons ago. I mean, three years ago feels like lifetimes ago. And then they did the IP licensing deal with Google in August 2024. And so, you know, at the time, Noam, he's talked publicly about this, right? He wanted to put products out in the world, and Google wouldn't let him. That's obviously changed drastically. But he went to go do that, and he had a product attached. The goal was, I mean, it's Noam Shazeer. He wanted to get to AGI. That was always his personal goal. But, you know, I think through collecting data, right, this sort of very human use case that the Character product originally was, and still is, was one of the vehicles to do that. I think the real reason, if you think about the stress that any company feels before you ultimately go one way or the other, is sort of this AGI versus product tension. And I think, you know, OpenAI is feeling that. Anthropic, if they haven't started, you know, felt it, certainly given the success of their products, they may start to feel it soon. And in reality, I think there are real trade-offs. Right. When you think about GPUs, that's a limited resource. Where do you allocate the GPUs? Is it toward the product? Is it toward new research? Is it long-term research? Is it near-to-midterm research? And so in a case where you're resource constrained, of course there's this fundraising game you can play. Right. But the market was very different back in 2023 too. I think the best researchers in the world have this dilemma of, okay, I want to go all in on AGI, but it's the product usage revenue flywheel that keeps the revenue in the house to power all the GPUs to get to AGI.
And so I think it sets up an interesting dilemma for any startup that has trouble raising up to that level. Right. And certainly if you don't have that progress, you can't continue this fundraising flywheel,

12:52

Speaker B

I would say, because we're keeping track of all of the things that are different. Right. Like venture growth and app infra. And one of the ones is definitely the personalities of the founders. It's just very different this time. I mean, I've been doing this for a decade and I've been doing startups for 20 years. And so, I mean, a lot of people start this to do AGI and we've never had like a unified North Star that I recall. In the same way, like people built companies to start companies in the past. Like that was what it was like. I would create an Internet company, I would create an infrastructure company. Like it's kind of more engineering builders. And this is kind of a different, you know, mentality. And some companies have harnessed that incredibly well because their direction is so obviously on the path to what somebody would consider AGI, but others have not. And so like there is always this tension with personnel. And so I think we're seeing more kind of founder movement.

15:00

Speaker D

Yeah.

15:54

Speaker B

You know, as a fraction of founders, than we've ever seen. Maybe since, I don't know, the time of Shockley and the Traitorous Eight or something like that, way back at the beginning of the industry. I mean, it's a very, very unusual type of personnel.

15:55

Speaker A

Totally. And I think it's exacerbated by the talent wars. I mean, every industry has talent wars, but not at this magnitude. Very rarely can you see someone get poached for $5 billion. That's hard to compete with. And then secondly, if you're a founder in AI, you could fart and it would be on the front page of, you know, The Information these days. And so there's sort of this fishbowl effect that I think adds to the deep anxiety that these AI founders are feeling.

16:07

Speaker B

Yes.

16:35

Speaker E

I mean, just briefly comment on the founder, the sort of talent wars thing. I feel like 2025 was just like a blip. Like, I don't know if we'll see that again because Meta built the team. I don't know if I think they're kind of done. And who's going to pay more than Meta? I don't know.

16:35

Speaker B

I agree. It feels this way to me too. It was like basically Zuckerberg kind of came out swinging and then now he's kind of back to building.

16:51

Speaker E

Yeah, yeah. You know, you gotta like, pay up to like assemble team to rush the job, whatever. But then now, now you, like, you made your choices and now they gotta ship. Right.

16:58

Speaker B

The other side of that is, you know, we're actually in the job hiring market. We've got 600 people here. I hire all the time. I've got three open recs if anybody's interested that's listening to this, on the investing side of the team. And a lot of the people we talk to have, you know, active offers for 10 million a year or something like that. And, you know, we pay really, really well, and just to see what's out there is really remarkable. So you're right, there's the really flashy one, like, I will get someone for, you know, a billion dollars, but the inflation trickles down. Yeah, it's still very active today.

17:05

Speaker A

I mean, yeah, you could be an L5 and get an offer in the tens of millions.

17:45

Speaker B

Yeah, easily.

17:50

Speaker A

So I think you're right that it felt like a blip. I hope you're right. But I think it's become the steady state; nothing got pulled back.

17:51

Speaker B

Yeah, yeah, exactly. For sure. Yeah, yeah.

17:58

Speaker D

And I think that's breaking the early stage founder math too. I think before, a lot of people were like, well, maybe I should just go be a founder instead of getting paid 800K to a million at Google. But if I'm getting paid 5, 6 million, that's different.

18:00

Speaker B

But on the other hand, there's more strategic money than we've ever seen historically. Right. And so the calculus on the economics is very different in a number of ways, and it's caused a ton of change and confusion in the market. Some very positive, some negative. So for example, the other side of the co-founder acquisition, you know, Mark Zuckerberg poaching someone for a lot of money, is that we're actually seeing a historic amount of M&A for basically acqui-hires. Right. Really good outcomes from a venture perspective that are effectively acqui-hires. So I would say it's probably net positive from the investment standpoint, even though it seems from the headlines to be very disruptive in a negative way.

18:12

Speaker D

Yeah, let's talk maybe about what's not being invested in. Like maybe some interesting ideas that you will see more people build or it seems in a way, you know, as YC is getting more popular, as like X is getting more popular, there's a startup school path that a lot of founders take and they know what's hot in the VC circles and they know what gets funded and there's maybe not as much risk appetite for things outside of that. I'm curious if you feel like that's true and what are maybe some of the areas that you think are under discussed?

19:00

Speaker B

I mean, I actually think that we've taken our eye off the ball in a lot of just traditional, you know, software companies. I mean, I think right now there's almost a barbell: either you're the hot thing on X, or you're deep tech.

19:34

Speaker D

Right.

19:50

Speaker B

But I feel like there's just kind of a long list of good companies that will be around for a long time in very large markets. Say you're building a database, or you're building, you know, kind of monitoring or logging or tooling or whatever. There are some good companies out there right now, but they have a really hard time getting the attention of investors. And it's almost become a meme, right? Which is, if you're not basically growing from 0 to 100 in a year, you're not interesting. Which is just the silliest thing to say. I mean, think of yourself as an individual person, with your personal money. Right. Will you put it in the stock market at 7%, or will you put it in this company growing 5x in a very large market? Of course you put it in the company growing 5x. So we say these stupid things like, if you're not going from 0 to 100... but who knows what the margins of those are. Clearly these are good investments for anybody. Right. Our LPs want, whatever, 3x net over the life cycle of a fund. Right. So a company in a big market going 5x is a great investment. Everybody would be happy with these returns. But we've got this kind of mania on the strong growers. And so I would say that that's probably the underinvested sector right now.

19:51

Speaker E

Boring software, Boring enterprise software.

20:57

Speaker B

Just traditional, like, really good companies.

20:58

Speaker E

No A.I. here.

21:01

Speaker B

No, well, the AI of course is pulling them into use cases. But that's not what they are. They're not on the token path. Right. Let's just say they're software, but they're not on the token path. They're great investments by any definition, except for, like, a random VC on X saying it's not growing fast enough. What do you think?

21:02

Speaker A

Maybe I'll answer a slightly different question, but adjacent to what you asked, which is an area that we're not investing in right now that I think is a question, and we're spending a lot of time on it regardless of whether we pull the trigger or not. And it would probably be on the hardware side, actually. Right. And robotics. I don't want to say it's not getting funding, because clearly it's sort of non-consensus to not invest in robotics at this point. But we've spent a lot of time in that space, and I think for us, we just haven't seen the ChatGPT moment happen on the hardware side. And the funding going into it feels like it's already taking that for granted.

21:20

Speaker B

Yeah. But we also went through the drone era.

21:58

Speaker E

There's a Zipline right out there.

22:02

Speaker B

Was that?

22:03

Speaker A

Oh yeah, there's a Zipline. Yeah.

22:04

Speaker B

And the AV era. One of the takeaways is, when it comes to hardware, most companies will end up verticalizing. If you're investing in a robot company for agriculture, you're investing in an ag company, because that's the competition, that's the sourcing, and that's the supply chain. And if you're doing it for mining, that's mining. And so the AD team does a lot of that type of stuff, because they're actually set up to diligence that type of work. But for horizontal technology investing, there's very little when it comes to robots, just because it's so fit for purpose. And so we kind of like to look at software solutions or horizontal solutions: Applied Intuition, clearly from the AV wave. DeepMap, clearly from the AV wave. I would say Scale AI was actually a horizontal one for robotics early on. So that sort of thing we're very, very interested in. But the actual robot interacting with the world is probably better for a different team. Yeah, I agree.

22:06

Speaker D

Yeah. I'm curious who these teams are supposed to be that invest in them. I feel like everybody's like, yeah, robotics, it's important and like, people should invest in it. But then when you look at like the numbers, like the capital requirements early on versus, like the moment of, okay, this is actually going to work, let's keep investing, that seems really hard to predict in a way that it's not.

22:57

Speaker B

I mean, CO2, Khosla, GC, these have all invested in hardware companies. And listen, it could work this time, for sure. Just the fact that Elon's doing it means there's going to be a lot of capital and a lot of attempts for a long period of time. So that alone maybe suggests we should just be investing in robotics, because you have this North Star, Elon with a humanoid, who's going to basically will an industry into being. But we've just historically found, look, we're huge believers that this is going to happen. We just don't feel we're in a good position to diligence these things, because again, robotics companies tend to be vertical. You really have to understand the market they're being sold into; that competitive equilibrium with a human being is what's important, not the core tech. And Sarah and I are more horizontal, core-tech-type investors. The AD team can actually do these types of things.

23:17

Speaker E

Just to clarify, AD stands for American Dynamism. All right. Yeah. I actually do have a related question. First of all, I want to acknowledge, on the chips side, I recall a podcast you were on, I think it was the a16z podcast about two or three years ago, where you said something that really stuck in my head, about how at some point of scale, it makes sense to build a custom ASIC per training run.

24:10

Speaker B

Yes. It's crazy.

24:35

Speaker D

Yeah.

24:36

Speaker E

I think you estimated 500 billion or something.

24:37

Speaker B

No, no, no. A billion-dollar training run. At a $1 billion training run, it makes sense to actually do a custom ASIC, if you can do it in time. The question now is timeline, not money, because, just rough math.

24:40

Speaker A

Math.

24:50

Speaker B

If it's a billion-dollar training run, then the inference for that model has to be over a billion, otherwise it won't be solvent. So let's assume it is. If you could save 20%, and you'd save much more than that with an ASIC, 20% is $200 million. You can tape out a chip for $200 million. Right? So now you can literally justify, economically, not timeline-wise, that's a different issue, an ASIC per model. Which is because

24:50

Speaker E

that's how much we leave on the table every single time we do like generic Nvidia.

25:12

Speaker B

Yeah, exactly. No, it's actually much more than that. You could probably get a factor of two, which would be $500 million.

25:16

Speaker E

Typical MFU would be like 50%. And that's good.

25:22

Speaker B

Exactly. Yeah, 100.
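The back-of-envelope math in this exchange can be sketched out. A minimal sketch, assuming the speakers' illustrative figures (a $1B run, 20% to roughly 50% savings, a ~$200M tape-out); `asic_breakeven` is a hypothetical helper, not anything the speakers use:

```python
# Back-of-envelope custom-ASIC economics, using the speakers' illustrative
# numbers (hypothetical, not real costs).

def asic_breakeven(training_cost, inference_savings_frac, tapeout_cost):
    """Return (dollars saved on inference, whether the ASIC pays for itself).

    Assumes the solvency floor from the conversation: lifetime inference
    spend for the model must at least match its training cost.
    """
    inference_spend = training_cost  # solvency floor assumption
    savings = inference_spend * inference_savings_frac
    return savings, savings >= tapeout_cost

# Conservative case: 20% savings on a $1B training run vs. a $200M tape-out.
print(asic_breakeven(1_000_000_000, 0.20, 200_000_000))  # (200000000.0, True)

# Factor-of-two case (~50% saved), closer to Speaker B's estimate.
print(asic_breakeven(1_000_000_000, 0.50, 200_000_000))  # (500000000.0, True)
```

Even in the conservative case the savings just cover the tape-out, which is why the speakers frame the constraint as timeline rather than money.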

25:25

Speaker E

So yeah. And I just want to acknowledge: here we are at the end of 2025, and OpenAI is confirming the Broadcom and all the other custom silicon deals, which is incredible. Speaking of AD, there's a really interesting tie-in that you guys have hit on, which is this sort of America-first movement, re-industrializing here, moving TSMC here if that's possible. How much overlap is there from AD to, I guess, growth, and investing particularly in US AI companies that are strongly bounded by their compute?

25:26

Speaker B

Yeah. So I would view AD more as a market segmentation than a mission. The market segmentation is: it has regulatory or compliance issues, or government sales, or deals with hardware. They're just set up to diligence those types of companies. So it's more of a market segmentation thing. I would say the entire firm, since inception, has had geographic biases.

26:00

Speaker E

Right.

26:26

Speaker B

I mean, for the longest time, like, you know, Bay Area is going to be like where the majority of the dollars go.

26:26

Speaker E

Yeah.

26:30

Speaker B

And listen, there are actually a lot of compounding effects to having a geographic bias.

26:31

Speaker C

Right.

26:34

Speaker B

You know, everybody's in the same place. You've got an ecosystem, you're there, you've got presence, you've got a network. And I would say the Bay Area is very much back. I remember pre-COVID, it was like crypto had kind of pulled startups away.

26:35

Speaker E

Miami.

26:49

Speaker B

Yeah, yeah. New York had a moment because it's so close to finance. Los Angeles had a moment because it was so close to consumer. But now it's kind of come back here. So I would say we tend to be very Bay Area focused historically, even though of course we invest all over the world. And then if you take the ring out one more, it's going to be the US, of course, because we know it very well. And then one ring more is going to be the US and its allies, and it goes from there. Yeah. Sorry.

26:50

Speaker A

No, no, I agree. I think that's sort of where the companies are headquartered. Maybe your question is on supply chain and customer base. I would say our companies are fairly international from that perspective. They're selling globally, right, and they have global supply chains in some cases.

27:14

Speaker B

I would say also the stickiness is very different.

27:30

Speaker A

Yeah.

27:32

Speaker B

Historically, between venture and growth, there's so much company building in venture: hiring the next PM, customer introductions, all of that stuff. Of course we're just going to be stronger where we have our network and have been doing business for 20 years. I've been in the Bay Area for 25 years, so clearly I'm more effective here than I would be somewhere else. For some of the later-stage rounds, the companies don't need that much help; they're already pretty mature, so they can be everywhere, and there's less of that stickiness. This is definitely different in the AI era, though. I mean, Sarah is now the chief of staff of, like, half the AI companies in the Bay Area right now. She's like an ops ninja: biz dev, biz ops.

27:33

Speaker E

Are you finding much AI automation in your work? What is your stack?

28:16

Speaker A

Oh, in my personal stack.

28:21

Speaker E

I mean, the reason for this is it's triggering something. I'm hiring ops people. A lot of founders I know are also hiring ops people. And I'm just, you know, seeing the opportunity. Since you're basically helping out with ops at a lot of companies: what are people doing these days? Because it's still very manual, as far as I can tell.

28:22

Speaker A

Yeah, I think the things we help with are pretty network-based, in the sense of, hey, how do I shortcut this process? Well, let's connect you with the right person. So there's not quite an AI workflow for that. I will say, as a growth investor, Claude Cowork is pretty interesting. For the first time you can actually get one-shot data analysis. If you're given a customer database, analyzing cohort retention, that's stuff you had to do by hand before. And the other night, it was like midnight and three of us were playing with Claude Cowork. We gave it a raw file. Boom. Perfectly accurate; we checked the numbers. It was amazing. That was my aha moment. It sounds so boring, but that's the kind of thing a growth investor is slaving away on late at night, done in a few seconds.

28:41

Speaker B

Yeah.

29:31

Speaker E

You've got to wonder about Anthropic Labs, which is their new product studio: what would that be worth as an independent startup?

29:32

Speaker B

You know, like a lot. Yeah, true. Yeah. You got to hand it to them. They've been executing incredibly well.

29:41

Speaker E

Yeah. I mean, to me, Anthropic building on Claude Code makes sense. The real pedal-to-the-metal moment, whatever the phrase is, is when they start coming after consumer, against OpenAI. That is red alert at OpenAI.

29:47

Speaker B

Oh, I think they've been pretty clear they're enterprise focused.

30:03

Speaker E

They have been. But publicly it's enterprise focus, it's coding, right? And yet here's Claude Cowork, and apparently they're running Instagram ads for Claude AI for consumers. So it's kind of this disruption thing: OpenAI has been doing consumer, pursuing general intelligence in every modality, and here's Anthropic, which only focuses on this one thing, but now they're sort of undercutting and doing the whole innovator's-dilemma thing on everything else.

30:05

Speaker B

It's very interesting. Yeah. I mean, there's a very open question. For me, do you know that meme where there's the guy on the path, and there's a path this way, there's a path

30:39

Speaker E

this way. Which way, Western man?

30:47

Speaker B

Yeah, yeah. And for me, the entire industry kind of hinges on two potential futures. In one potential future, the market is infinitely large. There are perverse economies of scale, because as soon as you put a model out there it kind of sublimates and all the other models catch up, and software is being rewritten and fractured all over the place, and there's tons of upside and it just grows. And then there's another path, which is: maybe these models actually generalize really well, and all you have to do is train them with three times more money, and they'll just consume everything above them. If that's the case, you end up with basically an oligopoly for everything, because they're perfectly general. So one path is the AGI path, where these models are perfectly general and can do everything, and the other is, this is actually normal software, and the universe is complicated. Nobody knows the answer. My belief is, if you actually look at the numbers of these companies, generally, if you look at the amount they're making against how much they spent training the last model, they're gross margin positive, and you go, oh, that's really working. But if you charge the current training they're doing for the next model against it, they're gross margin negative. So part of me thinks a lot of them are borrowing against the future, and that's going to have to slow down, that's going to catch up to them at some point in time, but we don't really know. Yeah.

30:49

Speaker E

Does that make sense?

32:15

Speaker B

I mean, it could be the case that the only reason this is working is because they can raise that next round and train that next model, because these models have such a short life. And at some point they won't be able to raise the next round for the next model, and then things will converge and fragment again. But right now it's not totally clear, I

32:16

Speaker A

think, by the way, just a meta point, the other lesson from the last three years is, and we talk about this all the time because we're in this Twitter/X bubble, but if you go back to, say, March 2024, it felt like an open-source model with benchmark-leading capability was launching on a daily basis. So that's one period where suddenly it's, open source takes over the world, there's going to be a plethora, it's not an oligopoly. And if you rewind time even before that, GPT-4 was number one for nine, ten months. That's a long time. And of course now we're in this era where it feels like an oligopoly, maybe some very steady-state shifts, and it could look like this in the future too, but it's just so hard to call. And the thing that keeps us up at night, in a good way and a bad way, is that the capability progress is actually not slowing down. And until that happens, you don't know what it's going to look like.

32:33

Speaker B

But I would say for sure it's not converged. Like, for sure the systemic capital flows have not converged. Meaning right now it's still borrowing against the future to subsidize current growth, which you can do for a period of time. But at some point the market will rationalize it, and nobody knows what that will look like. Or the drop in the price of compute will save them. Who knows?
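The accounting Speaker B describes can be sketched with purely hypothetical numbers; `gross_margin` and `margin_with_next_run` are illustrative helpers, not real company figures:

```python
# Sketch of "gross margin positive on the shipped model, negative once you
# count the next training run." All dollar figures are hypothetical ($B).

def gross_margin(revenue, last_training_cost, serving_cost):
    """Margin charging only the model already shipped."""
    return (revenue - last_training_cost - serving_cost) / revenue

def margin_with_next_run(revenue, last_training_cost, serving_cost,
                         next_training_cost):
    """Same, but also charging the run currently underway for the next model."""
    return (revenue - last_training_cost - serving_cost
            - next_training_cost) / revenue

rev, last_run, serving, next_run = 4.0, 1.0, 1.0, 3.0  # next run ~3x bigger
print(gross_margin(rev, last_run, serving))                    # 0.5
print(margin_with_next_run(rev, last_run, serving, next_run))  # -0.25
```

On the shipped model alone the business looks healthy; fold in the next, roughly three-times-larger run and it is underwater, which is the "borrowing against the future" dynamic.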

33:37

Speaker D

Yeah, yeah. I think the models need to asymptote on specific tasks. Like, okay, now Opus 4.5 might be AGI at some specific task, and now you can depreciate the model over a longer time. I think right now there's, like, no old model.

34:02

Speaker B

No, but let me just change that mental model; that used to be my mental model too. Let me change it a little bit. If you can raise more than the aggregate of everybody that uses your models, that doesn't even matter. See what I'm saying? So I have an API business. My API business is 60% margin, or 70%, or 80%. It's a high-margin business. So I know what everybody is using. If I can raise more money than the aggregate of everybody that's using it, I will consume them, whether I'm AGI or not. And I will know what they're doing, because they're using my API. And unlike in the past, where engineering stops me from doing that, this is very straightforward: you just train. So I also thought it was, you must asymptote to AGI, to general. But I think there's also just a possibility that the capital markets will give them the ammunition to just go after everybody on top of them.

34:15

Speaker A

I do wonder, though, to your point, if there are certain tasks where getting marginally better isn't actually that much better. Like, we've asymptoted; you can call it AGI or whatever. Actually, Ali Ghodsi talks about this: we're already at AGI for a lot of functions in the enterprise. For those tasks, you probably could build very specific companies that focus on getting as much value out of that task as possible, value that isn't coming from the model itself. There's probably a rich enterprise business to be built there. I could be wrong on that, but there are a lot of interesting examples. Say you're looking at the legal profession, and maybe that's not a great one because the models are getting better on that front too, but something where it's a bit saturated: then the value comes from services, from implementation, from all these things that actually make it useful to the end customer.

35:03

Speaker B

Sorry, where was I? One more thing I think is under-discussed in all of this is to what extent every task is AGI-complete. I code every day. It's so fun.

35:52

Speaker A

That's a poor question. Yeah.

36:03

Speaker B

And when I'm talking to these models, it's not just code. It's everything. Right? It's healthcare.

36:04

Speaker E

I mean, it's legal, but it's every.

36:12

Speaker B

It's exactly that. It's everything. I'm asking these models to understand compliance. I'm asking these models to go search the web. I'm asking these models to talk about things I know from history. It's having a full conversation with me while I engineer. And so it could be the case that the most AGI-complete model, and I'm not an AGI guy, will always win, independent of the task. We don't know the answer to that one either. But it seems to me, listen, Codex in my experience is for sure better than Opus 4.5 for coding. It finds the hardest bugs I work on; the smartest developers I know work with it. It's great. But Opus 4.5 has a great bedside manner. And that really matters if you're building something very complex, because it's a partner, a brainstorming partner. I think we don't discuss enough how every task has that quality, and what that means for capital investment, frontier models, and sub-models. Like, what happened to all the special coding models? None of them worked, right?

36:14

Speaker A

Right.

37:18

Speaker D

Some of them didn't even get released.

37:19

Speaker E

Magic.dev or there's a whole.

37:21

Speaker B

There's a whole host. We saw a bunch of them. And there's this whole theory that there could be one, and I think one of the conclusions is, there's no such thing as a coding model. That's not a thing. You're effectively talking to another human being; it's good at coding, but it's got to be good at everything.

37:23

Speaker E

Minor disagree, only because I have pretty high confidence that OpenAI will basically always release a GPT-5 and a GPT-5 Codex. The way I call it is: one for rizz and one for tizz. And someone internal at OpenAI was like, yeah, that's a good way to frame it.

37:38

Speaker B

That's so funny.

38:00

Speaker E

But maybe it collapses down to rizz and tizz and that's it. It's not 100 dimensions, it's two dimensions: bedside manner versus coding.

38:01

Speaker B

I think for anybody listening to this, or for you: when you're coding or using these models for something, just be aware of how much of the interaction has nothing to do with coding. It turns out to be a large portion of it. So I think the best SOTA-ish general model is going to remain very important, no matter what the task is.

38:13

Speaker E

Yeah. Speaking of coding, I'm going to be cheeky and ask: what actually are you coding? Because obviously you could code anything, and you're a busy investor and a manager of a giant team. What are you working on?

38:34

Speaker B

I help Fei-Fei at World Labs. It's one of our investments, and they're building a foundation model that creates 3D scenes.

38:46

Speaker E

Yeah, we had her on our own pod.

38:55

Speaker B

Yeah. And these 3D scenes are Gaussian splats; that's just the way that kind of AI works. You can reconstruct a scene better with radiance fields than with meshes, because they don't really have topology. So they produce these beautiful 3D rendered scenes that are Gaussian splats. But the actual industry support for Gaussian splats isn't great. It's always been meshes; things like Unreal use meshes. So I work on an open-source library called SparkJS, a JavaScript rendering library for Gaussian splats, just because you need that support, and right now the three.js ecosystem is all meshes. It's become kind of the default for splats in the three.js ecosystem. As part of that, to exercise the library, I build a whole bunch of cool demos. So if you see me on X, you see all my demos and all the world-building, but all of that is just to exercise this library I work on, because it's actually a very tough algorithmic problem to scale a library that much. And, ancient history now, but 30 years ago I paid for undergrad working on game engines in college in the late 90s. So I actually have a background in this, a very old background, and a lot of it's fun. But the whole goal is just for this rendering library to...
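The point about topology can be made concrete with a toy sketch: a splat scene is just a flat list of Gaussians, and rendering a ray reduces to depth sorting plus alpha compositing, with no mesh connectivity anywhere. This is a simplified 1D illustration, not Spark's actual renderer; `splat_weight` and `composite` are hypothetical helpers:

```python
# Toy illustration (not Spark's actual renderer): a Gaussian-splat "scene"
# is just a flat list of (center, scale, color, alpha) tuples. There are
# no vertices, edges, or faces anywhere in the data structure.
import math

def splat_weight(ray_t, center, scale):
    """Gaussian falloff of one splat at depth ray_t along the ray."""
    return math.exp(-0.5 * ((ray_t - center) / scale) ** 2)

def composite(splats, ray_t):
    """Alpha-composite splats back-to-front with the 'over' operator."""
    color = 0.0
    for center, scale, c, a in sorted(splats, key=lambda s: -s[0]):  # far first
        w = a * splat_weight(ray_t, center, scale)
        color = color * (1.0 - w) + c * w
    return color

# Two splats on one camera ray; sampling at the near splat's center.
scene = [(5.0, 0.5, 1.0, 0.9), (2.0, 0.5, 0.2, 0.8)]
print(round(composite(scene, 2.0), 3))  # 0.16: dominated by the near splat
```

Because the representation is an unstructured list, scaling it is a sorting-and-compositing problem rather than a geometry-processing one, which is the "tough algorithmic problem" referenced above.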

38:56

Speaker A

Are you one of the most active contributors on their GitHub, SparkJS?

40:15

Speaker B

Yeah, there's only two of us, so.

40:19

Speaker E

Yes.

40:22

Speaker B

So, by the way, the primary developer is a guy named Andreas Sundquist, who's an absolute genius. He and I did our PhDs together; we studied for comps and quals together. So it's almost like hanging out with an old friend. He's the core, core guy. I do mostly kind of, you know, supporting work. But it's fun.

40:24

Speaker E

It's amazing. Like, five years ago you would not have done any of this, and it brought you back.

40:42

Speaker B

The activation energy was so high, because you had to learn all the framework bullshit, man. I fucking used to hate that. Now I don't have to deal with that. I can focus on the algorithmics, I can focus on the scaling. Yeah.

40:47

Speaker E

And then I'll observe one irony, and then I'll ask a serious investor question. The irony is: Fei-Fei actually doesn't believe that LLMs can lead us to spatial intelligence. And here you are using LLMs to help achieve spatial intelligence. I see some disconnect in there.

40:57

Speaker B

Yeah, yeah. I think what she would say is LLMs are great for helping with coding.

41:13

Speaker E

Yes.

41:18

Speaker B

But, like, that's very different than a model that actually, like, provides.

41:19

Speaker E

They'll never have the spatial intelligence.

41:22

Speaker B

Listen, our brains clearly have both. They clearly have a language-reasoning section, and they clearly have a spatial-reasoning section. These are two pretty independent problems.

41:24

Speaker E

Okay. And I would say the one data point I recently had against that is the DeepMind IMO gold. Typically the answer is that this is where you start going down the neurosymbolic path.

41:34

Speaker B

Right.

41:49

Speaker E

Like, one vague, abstract reasoning thing and one formal thing. And that's what DeepMind had in 2024 with AlphaProof and AlphaGeometry. And now they just use Deep Think and extended thinking tokens, and it's one model, and it's an LLM.

41:49

Speaker A

Yeah.

42:04

Speaker E

And so that was my indication that maybe you don't need a separate system.

42:05

Speaker B

System. Yeah. So let me step back. At the end of the day, these things are nodes in a graph with weights on them, right? It can be models; you distill it down. But let me just talk about the two different substrates. Let me put you in a dark room, a totally black room, and then let me describe how to exit it: to your left there's a table, duck below this thing, right? The chances that you're going to not run into something are very low. Now let me turn on the light, so you can actually see, and you can judge distance, how far something is, where it is; then you can do it. Language is not the right primitive to describe the universe, because it's not exact enough. That's all Fei-Fei is talking about when it comes to spatial reasoning: you actually have to know that this is three feet away, that it's curved. You have to understand actual movement through space.

42:09

Speaker D

Yeah.

43:08

Speaker B

So listen, I do think at the end of the day these models are definitely converging, as far as models go. But there are different representations of the problems you're solving. One is language, which is like describing to somebody what to do, and the other is actually just showing them. Spatial reasoning is just showing them. Yeah, right. Got it.

43:08

Speaker E

The investor question on World Labs is, well, like, how do I value something like this? What's the framework? I'm just like, Fei-Fei is awesome, Justin's awesome, and the other two co-founders. Everyone's building cool tech. But what's the value of the tech? That's the fundamental question.

43:25

Speaker B

Let me maybe give you a rough sketch on the diffusion models. And I'd actually love to hear Sarah on this, because I'm a venture person, and venture is always kind of Wild West.

43:44

Speaker E

You paint a dream. And she has to, like...

43:52

Speaker B

Actually, I'm gonna say it, and she marks it to reality.

43:54

Speaker A

Exactly.

43:57

Speaker B

So I'm going to give the venture view, and she can be like, okay, you little kid. So these diffusion models literally create, for almost nothing, something the world has found to be very valuable in real markets. Like a 2D image. That's been an entire market; people value them, and it takes a human being a long time to create one. To turn me into a whatever in an image would cost a hundred bucks and an hour. The inference costs a hundredth of a penny. We've seen this with speech, in very successful companies. We've seen this with 2D images. Now think about a 3D scene.

43:58

Speaker E

I mean.

44:36

Speaker B

I mean, when's Grand Theft Auto 6 coming out? It's been 10 years. How much would it cost to reproduce this room in 3D?

44:36

Speaker E

If you.

44:48

Speaker B

If you hire somebody on Fiverr, at any sort of quality, probably $4,000 to $10,000. And if you had a professional, probably $30,000. So if you could generate the exact same thing from a 2D image for less than a dollar, and we know these assets are used: people are using Unreal, using Blender, movies, video games, all of it. That's four or five orders of magnitude cheaper. You're bringing the marginal cost of something useful down by orders of magnitude, which historically has created very large companies. So that would be the venture strategic dreaming map.
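The cost collapse above, using the speaker's illustrative prices ($30,000 for a professional versus "less than a dollar" of inference; hypothetical figures from the conversation, not measured costs), works out as:

```python
# Orders-of-magnitude reduction in marginal cost, per the speaker's
# illustrative (hypothetical) numbers.
import math

human_cost = 30_000  # professional artist reproducing a room in 3D ($)
model_cost = 1       # generative inference, "less than a dollar" ($)

orders_of_magnitude = math.log10(human_cost / model_cost)
print(round(orders_of_magnitude, 1))  # 4.5, i.e. four to five orders of magnitude
```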

44:48

Speaker E

Yeah. And for listeners, you can do this yourself on your own phone with Marble. There are also many NeRF apps where you just go around with your iPhone and do this.

45:17

Speaker B

Yeah, yeah. In the case of Marble, though, what you do is literally give it an image. Most NeRF apps, you run around and take a whole bunch of pictures and then reconstruct the scene. Things like Marble, the whole generative 3D space, will just take a 2D image and reconstruct all the

45:27

Speaker E

like, meaning it has to fill in

45:44

Speaker B

the back of the table, under the table, the parts the image doesn't show. So the generative stuff is very different from reconstruction: it fills in the things you can't see. Yeah. Okay, so, all right, so now.

45:47

Speaker A

No, no, I mean, I love that

45:55

Speaker B

from the adult perspective.

45:56

Speaker A

Well, no, I was going to say we're very much a tag team. We started this pod with that premise, and this is a perfect question to build on it, because we truly are tag-teaming all of these together. But I think every investment fundamentally starts with the same two premises. One: at this point in time, we actually believe there are n-of-1 founders for their particular craft, and it has to be demonstrated in their prior careers. So we're not investing in every, you know, now the term is "neolab," every founder trying to build a foundation model. Contrary to popular opinion, we're not invested in all of them. We have a very specific thesis.

45:57

Speaker E

I don't think people say that about you. No, they don't. They don't.

46:37

Speaker A

They say we're big, we're in everything. But if you think about Ilya, right, he's at SSI; he's been behind almost every foundational breakthrough of the last 15 years. If you think about the Thinking Machines team, Mira and John: John is the godfather of reinforcement learning. I go through this because, for each of the bets we've made, it goes back to a very specific thesis about that person, the team they've assembled, and what they've done in a prior life. And obviously we talked about the talent wars; we do think at this particular moment in time there are particular people who can move needles. Clearly other companies believe that too, otherwise they wouldn't be willing to pay such crazy prices for a single individual. So that's one. And two: we don't think it's a zero-sum game. If that were true, OpenAI, or actually just DeepMind, would be number one in everything. There's clear value to specialization. Look at ElevenLabs: there have been so many audio models hitting the market, and they're still freaking number one. And they've created a ton of value for their customers, for their investors, for their team. So if you put those two together, that's the foundation of our thesis when we back these foundation model companies. Of course the valuations sound astronomical when you think about current revenue. I would say that's the market out there, because they are raising larger dollars; they have compute needs, and that's typically 80% of a round they raise. But the thing that gets us excited about backing them is that revenue growth has typically followed the capability breakthrough.
So it ties back to that question of the cyclical nature: are you just funding it, and raising more funding when there's a real capability breakthrough? The demand is there, and so the revenue growth is much faster than we've ever seen once it's turned on. There's a company, I can't share the name, but their product went GA, and in a few weeks: tens of millions of revenue. Right.

46:40

Speaker E

Yeah, I've seen this myself.

48:50

Speaker A

Yes, absolutely. We have SaaS companies that have been in business for seven years and get to the same level seven years later, and the growth is eking along at whatever it is. And by the way, great companies, not at all diminishing what they've accomplished. But the fact is, to get to that revenue that quickly, it's not just the two companies people talk about. Really, every domain has a specialist, and we think if you can win that, you become very large very quickly. And that's actually played out in the numbers.

48:52

Speaker E

Yeah. So first of all, thank you for that overall take. I think it's important to hear your perspective, because the rest of us are just looking at headlines, not knowing how to make sense of any of this. Our listeners will roast us if we mention Thinky and don't discuss what happened. Obviously founder splits happen, but I guess, has the thesis changed? What's going on?

49:24

Speaker A

Yeah, we're more excited than ever about them. They have some things coming that we're not going to break as news on a pod; obviously they should share those themselves. But when you bring a team of that caliber together, special things happen, and I think 2026 is going to be a big year for them. Obviously, some of the themes we talked about before apply: even with just the media news storm, something happens and then it's everywhere instantly. That's a tough situation for any company to be in. But to come out of that stronger than ever? We're more bullish about Thinky than we were even before.

49:53

Speaker E

And the story is Tinker, custom models, RL. Yeah. Is that what we're aiming for?

50:41

Speaker A

Yeah, and a bunch of stuff we can't talk about here. Yeah. Cool. Absolutely. But no, that team is cooking, and I think they'll be just fine. They'll recover from the events in January.

50:49

Speaker E

Yeah.

51:01

Speaker B

I will say, we have a very privileged position on the boards of these companies. And I've never seen the perception be further from the truth, industry-wide, ever. I guarantee you, for any of these gossipy things, it's way off. Way, way off. What happens is we've got this crazy game of telephone right now. There are always seeds of truth, but it gets so warped. We hear rumors all the time about stuff that we're directly involved in, where we're literally on the board, we're the ones that did the thing, and by the time it gets back to us it's gotten so warped and so twisted. Everybody's excited, there's a lot of focus, the schadenfreude is so high that people just kind of will into being things that didn't exist. So I don't want to comment specifically on Thinking Machines, but it's an important

51:02

Speaker E

message to the general audience.

52:00

Speaker B

I will tell you, if you hear something on X, the chances that it accurately represents what actually happened are very, very low.

52:01

Speaker E

Yeah.

52:10

Speaker A

I have never lost so much faith in the anon accounts on Twitter that just seem very confident in what they're saying, and it couldn't be further from the truth. I had a couple-day stretch where I was like, oh my God, Twitter is mind poison. And I love X, but we talk

52:11

Speaker B

to each other all the time, because we actually know, because we're there. We're there seeing these things, and Sarah will text me, like, whatever, and it's ridiculous. But the problem is, we realize that these things start taking on a life of their own, and then people assume that they're real. And so I think it's very tough for founders, because it's tough enough fighting the real battle. Absolutely. Now they're fighting phantoms too. So more and more, and I got this from the Cursor guys, which I really appreciate, from Michael Truell: listen, heads down, focus on the business. And they absolutely crushed it.

52:25

Speaker E

Yeah.

53:03

Speaker B

Yeah. And I think that's right. I think all folks should do that right now, because the noise is so high.

53:03

Speaker A

Yeah, no, that team's been back to business for weeks, the Thinky team. So.

53:08

Speaker D

Yeah. Yeah.

53:11

Speaker E

Well, thank you for indulging in that. It is just the hot topic of the moment; we've gotta address the elephant in the room. Cursor, right. Obviously you guys are big investors. 2025, I would say, is Cursor's year, maybe decade. Going back to the discussion of whether AGI would just kind of consume everything, Cursor is kind of the shining example of: here's how you build an application-layer company that's a wrapper, but an extremely damn good one. So I guess, what's the general analysis of Cursor's development and what it means for everyone? Is there a Cursor in every industry to be built?

53:12

Speaker B

Yeah. So the interesting thing about Cursor is that, for a small fraction of the cost, a hundredth the cost or less, they developed an almost state-of-the-art model, which for a period of time was the most popular coding model in the world. Which is really crazy to think about. So they're just kind of doing it in reverse. There are two approaches: you start with a foundation model and you verticalize up, or you start with the app and all of the product data, and you go down. They're the ones doing the latter. I think any company that's doing an app has to ask the margin question, which is: how do I extract margin on the tokens that are going through? You have to be on the token path, and everybody has to ask that question. And I've just thought they've been incredibly thoughtful about it. One reason is, if you ask Michael what type of company they are: they are a developer company for professional developers. That's what they are. They're a dev tools company, just focused on coding, and that's huge even if you didn't do AI. They acquired Graphite, and listen, we were investors in GitHub, we know how big this market is. So that's a massive market even without becoming a model company. But they've also been quite successful in doing their own models. I think it just shows you that if you are focused and you have a large use case, there's a huge opportunity not only to win the application but to start building your own models. Are these going to be the only models people use? Of course not. But they are in a great position to serve great models, and they've demonstrated that.

53:52

Speaker E

Yeah, my sort of thesis, which we don't have to go into here, is that what I've been calling agent labs, people who build on top of all the other models, will probably have a better time with the margins, because they price against end-user hours spent, or human labor, whereas models get commodity-priced per token. So margin-wise, we know the inference economics for model labs, but for agent labs, the difference is the delta between the cost of token intelligence, which keeps going down, and human costs, which keep going up.

55:19

Speaker B

Yeah.

55:56

Speaker E

And so the margin should be higher.

55:56

Speaker B

They should be. The caveat to that is if the models go first party, right, what they can do is...

56:00

Speaker E

Which is the composer dream.

56:08

Speaker B

Yeah, they can subsidize themselves on the models, and then they can charge the third party more. And it's a very delicate dance, because you're kind of competing with your own customers. We've seen this historically: we saw it with the cloud, with EC2, and we saw it with the operating system. So this is not unusual, but it's playing out very, very quickly.

56:09

Speaker D

Yeah, thank you for joining us. That's all the time we have today.

56:32

Speaker B

Such a pleasure.

56:34

Speaker D

You're welcome back anytime.

56:35

Speaker E

And thank you for being so open and also like just leading the industry in so many areas. It's really inspiring to see.

56:37

Speaker B

So thank you so much.

56:43

Speaker A

Thank you for having us.

56:44

Speaker B

Thank you.

56:45

Speaker C

Thanks for listening to this episode of the A16Z podcast. If you liked this episode, be sure to like, comment, subscribe, leave us a rating or review, and share it with your friends and family. For more episodes, go to YouTube, Apple Podcasts, and Spotify. Follow us on X @a16z and subscribe to our Substack at a16z.substack.com. Thanks again for listening, and I'll see you in the next episode. This information is for educational purposes only and is not a recommendation to buy, hold, or sell any investment or financial product. This podcast has been produced by a third party and may include paid promotional advertisements, other company references, and individuals unaffiliated with A16Z. Such advertisements, companies, and individuals are not endorsed by AH Capital Management, LLC, A16Z, or any of its affiliates. Information is from sources deemed reliable on the date of publication, but A16Z does not guarantee its accuracy.

56:48