TBPN

FULL INTERVIEW: Alex Karp on AI, Job Loss, and the Future of Work

28 min
Mar 12, 2026
Summary

Alex Karp, CEO of Palantir, discusses how AI will displace white-collar jobs, particularly affecting highly educated workers who aren't neurodivergent. He argues that America needs major policy reforms including restricted immigration, vocational training overhaul, and educational system changes to prepare for widespread job displacement while maintaining military AI superiority over China.

Insights
  • AI will primarily displace white-collar jobs of highly educated workers who lack neurodivergent thinking patterns
  • Companies that combine human expertise with AI (hybrid model) will outperform pure software solutions
  • America faces a choice between proactive policy reforms or eventual social unrest from mass unemployment
  • Vocational training and alternative education paths will become crucial as traditional academic skills lose value
  • Military AI deployment is essential for national security even as domestic AI use requires careful regulation
Trends
  • Shift from pure software to human-AI hybrid business models
  • Increasing value of neurodivergent thinking and creativity over traditional academic skills
  • Growing need for vocational training programs modeled after European systems
  • Rising risk of social unrest due to AI-driven job displacement
  • Bifurcation between domestic AI regulation and military AI deployment
  • Movement toward nationalizing AI technologies as job losses mount
  • Decline in value of traditional white-collar skills and credentials
  • Increased importance of tribal knowledge and institutional expertise
  • Growing competition between US and China in AI military applications
  • Need for immigration policy changes to protect domestic employment
Companies
Palantir
Karp's company providing hybrid human-AI solutions for government and enterprise clients
Goldman Sachs
Used as example of traditional corporate jobs that may become less valuable
BMW
Example of German vocational training producing skilled manufacturing workers
Airbus
Cited as example of complex manufacturing requiring high-level vocational skills
People
Alex Karp
Palantir CEO discussing AI's impact on jobs and society
Dario Amodei
Referenced regarding predictions of 50% early-stage white-collar job losses
Quotes
"There are basically two ways to know you have a future. One, you have some vocational training. Or two, you're neurodivergent."
Alex Karp
"If you're not neurodivergent and you're like lawyer 14506, that's a problem."
Alex Karp
"We're going to take your job, we're going to eviscerate your ability to have money and power, but we're not going to defend you on the battlefield."
Alex Karp
"There's gonna be a movement in this country that gets very strong very quickly to nationalize these things."
Alex Karp
"Making rich people miserable is the only way to help poor people. That's obviously true."
Alex Karp
Full Transcript
3 Speakers
Speaker A

Well, we are joined by Alex Karp. Welcome to the show. Thank you so much for taking the time.

0:00

Speaker B

You guys went with or without hat?

0:05

Speaker A

Whatever is comfortable for you. You can do hat out here.

0:07

Speaker C

Keep your hat on.

0:10

Speaker A

I feel like this is an average day for you. You're always out skiing.

0:11

Speaker C

We should have done that.

0:14

Speaker B

Yeah. We need to get.

0:15

Speaker C

We're close to getting skis. Look

0:15

Speaker B

physical.

0:19

Speaker A

It's starting to stick.

0:19

Speaker B

It's starting to stick. Oh, I like this. But now I feel like a newscaster. Very important.

0:20

Speaker A

Yeah, this is good.

0:25

Speaker B

You're, like, 8ft tall. People don't realize that he's not standing up, because it would be like he's like. For all of us.

0:27

Speaker A

Last time we talked to you, I think you were doing four minutes on the dead hang. What's it up to now?

0:37

Speaker B

500. 505. 505.

0:40

Speaker A

What about in the cold, though?

0:42

Speaker B

Sorry, we need to have a special minute for that 505. For those of you who haven't done a dead hang, first of all, go do it.

0:45

Speaker A

Why is it important?

0:51

Speaker B

Well, there. There are very few things that are proxy indicators that are accurate for health. Yeah, it's like dead hang, farmer's walk, body weight, and VO2. Those are the ones that count.

0:53

Speaker A

Okay.

1:01

Speaker B

I don't think anyone really.

1:02

Speaker A

One rep max bench press. That's all I focus on.

1:03

Speaker B

Okay. Well, you know, it's like, I feel

1:06

Speaker A

like as long as I have a really impressive bench press, I'll. I'll live a short but glorious life.

1:08

Speaker B

Yeah. I don't know.

1:13

Speaker A

Did they say something about that?

1:14

Speaker B

Yeah.

1:15

Speaker A

Who wants to live a long, glorious life when you can live a glorious, glorious, short life?

1:15

Speaker B

Yeah. I think that's what they tell you before they give you a bad salary. I think it's like, yeah, it's. Or it's like, yeah, my social life is so great. I only have a bot, but I enjoy it. Yeah. The kind of logic. No, but no, it's.

1:20

Speaker A

Dead hang is important.

1:33

Speaker B

Dead hang is crucial.

1:34

Speaker A

Okay.

1:35

Speaker B

And you really need to go work on it. Especially anyone watching your podcast.

1:36

Speaker A

Yes.

1:40

Speaker B

Is likely to outperform.

1:41

Speaker A

Yeah.

1:43

Speaker B

You want to have something, you know, you want to be able to do something with that outperformance.

1:44

Speaker C

Yeah.

1:48

Speaker B

Like? Like. Yeah. Well, the dead hang may be a proxy indicator for other things you could do with your outperformance.

1:48

Speaker A

Yes.

1:53

Speaker B

You know, but, you know, not everyone is. Yeah. Not everyone is, like, 6 foot 9, and, like, you may not need a dead hang, but the rest of us.

1:53

Speaker A

Well, I can cheat, because most of

2:01

Speaker B

the pull up Bars.

2:03

Speaker A

I just stand on the floor.

2:04

Speaker B

That's right. Yeah.

2:05

Speaker A

I can hold forever.

2:06

Speaker B

You can forever. That's right. That's right.

2:06

Speaker A

But, but when you're not dead hanging. What should people be doing with the new coding agents? How important is it to learn to code? How important is it to.

2:09

Speaker B

Look, everybody's worried about their future. But there are basically two ways to know you have a future. One, you have some vocational training. Okay, so it's like. Or two, you're neurodivergent. And, and when I say neurodivergent, I mean broadly defined. Like, you guys are sitting here. You could have had a corporate tool job. Yeah, you could have been like, I don't want to pick on Goldman, but, like, just say, you know, like a job.

2:16

Speaker A

I applied there, they turned me down. Well, yeah, I did work at C.

2:40

Speaker B

You could have had a job where you're like, actually, maybe they didn't know the right way to test. Yeah, like, it's like, yeah, they were like, like, you know, whatever. I'm not picking on one or the other. I'm just saying, you, you, you know, like, you are probably here, you think you're here because, but it's actually, you probably wouldn't be able to do that, because it's the same thing as sit down in class and learn something, and you just regurgitate it. Like, that's not a valuable thing. If you actually have insights into anything and you have real technical expertise, like, you know, you can look at a company, but you actually can look at it because you know something about how these things work or something about how clients work, you know, then all the other stuff that used to be precious, like being able to do low-end coding, being able to do low-end lawyering, being able to do low-end reading and writing. I mean, this is like, I feel like Odin came down and was like, I'm going to make the world just right for a dyslexic. It's like, yeah, Odin has come down and said, you know what, Karp, you suffered so much as a kid. Yeah, I'm just going to make the whole world so everyone else can suffer. I don't want that. It's like now. But it's really an inversion. Like, everybody with, like, the normal-shaped skills are the dyslexics now, because, like, the thing they can do that used to be valuable is not so valuable. The thing that they need to learn to do is, like, be more of an artist, look at things from a different direction, be able to build something unique. And you see this on the battlefield. Like, one of the most underappreciated things about fighting a war (I mean, there are basic core things civilizations do, like build technology for war) is every society fights differently. Every component of the society fights differently.
And, like, America and its allies, we do not even approach these problems in the same way. What makes America lethal, more so than any other country, is, like, a combination of, obviously, the technology, which we're super interested in and believe we're playing a huge part in, but it's, like, 20 years of, like, operators figuring out what worked. Not what worked in a manual, like, what worked in reality. Also, even selection. If you look at the selection of people, like, you meet, like, tier-one operators, they don't look anything like what people would think. It's not like the movies. They're like these, like, these, like this big. And, like, you know, it's like, because, yeah, we have, we have, we have specialized ways of doing that. All that is crazy valuable. As a proxy indicator, people who are getting their news from you are just likely to massively outperform people who are getting their news from something that's a regurgitation of you got to vote for one party or the other. And then the real problem we have in society is not your listeners or Palantir's customers or our partners. It's like, well, what happens to everyone else? Are they going to lynch us? Because, like, that's the real problem. Like, these products, like, what we're building, like, our agents mean that, like, the most. I mean, the most powerful people in the Democratic Party are highly educated female voters. And these technologies, like, they love. I mean, like, I actually get along with all these people in private, and there's a public dispute, but, like, largely, I've talked to Dario over and over again. It's like, yeah, you love one company because they're not pro Trump. That company's taking your job. How are you going to feel about that company when you find out you have no job? What do you think the Republican Party is going to do to products that do not support our military?
What do you think the Democratic Party is going to do to products, even if you're voting for them, that are taking away the jobs of every one of your constituents and saying, oh, people are going to love you so much and you're going to be poor, by the way, we love you so much, we're going to give you a little handout once a month. Yeah, yeah.

2:45

Speaker A

So apply that to the SaaS apocalypse narrative, the enterprise SaaS. There's an idea that the first jobs that will be taken might be the enterprise software products that exist, haven't really innovated, have lock-in, and are going to be replatformed.

6:26

Speaker B

If you. The thing that these technologies do is they also make it harder to lie. This is part of the political problem. If something's not creating value, or something's not working, or there's corruption, you can't lie about it. And nobody believes that all software companies actually create value. I mean, the famous thing that we all learned, that we rejected, that people taught you, is, like, your software company is supposed to give the client a feeling they're getting laid while they're getting fucked. Now, if that's how your products actually are, now it's: you are going to get fucked. And this is going to happen so quickly. And the simple test for people who are looking at this is, does this product, or in our case, we were never pure software. We're actually, like, a hybrid of, like, humans, FDEs, augmented humans, so AI FDEs, and then orchestration, and then essentially what we would call primitives: taking the tribal knowledge of an institution, coding it into logic, and then using that to be extended in LLMs. Okay, but we don't have to explain that to our clients.

6:41

Speaker A

You know, so, so, so there was always this: Palantir's a consulting firm. Is it all just people? Is there any real software? I feel like that narrative went away.

7:48

Speaker B

Oh, no, no. They couldn't invest in us because we were a services company.

7:57

Speaker A

Exactly.

8:01

Speaker B

And now. Now it's like. And now.

8:01

Speaker A

But is, is. Is actually having that service better in the future?

8:03

Speaker B

No, no, it's not better.

8:07

Speaker A

It's underrated. It's crucial.

8:08

Speaker B

It's crucial. Yeah. Like all these places that made fun of us.

8:10

Speaker A

Yes.

8:13

Speaker B

They're running around and trying to get FDEs. Getting an FDE is like. Yeah, it's like. Yeah. It's not as easy as it sounds, because you have to know how to manage it, where to put the person, how to extract value. And then you need all these products that augment the FDE. What are those products? Ontology, Foundry, FDE AI. Things that we've built.

8:14

Speaker A

Yeah.

8:33

Speaker B

What we.

8:34

Speaker A

So the value of the business is not a monolithic code base that never changes. It is the people, it is the deployment, it is the relationships. Is that how you're thinking about the business?

8:35

Speaker B

These, this. Well, actually, I'll tell you, like, when you walk around here? Yeah, they only care. They don't care about any of that. What they care about is: you transformed my business in three months; it would have taken three years, and it would never have happened. Yeah, that's what they care about. Now, then there's a question of how do you do that? And that is a concatenation of artistry. It's like: select client, select where you would start. Select ways to innovate that they would not expect. Innovate in places they do not understand you should innovate. Learn to manage these very complex. By the way, it's not just culturally complex, it's tribal knowledge. And much of that tribal knowledge is in rules that they have to apply. Because there are all sorts of rules around manufacturing, hospitals, war. Rules that are applied that they're not saying they apply. Laws, like, all sorts of regulatory things. On top of all that, all that has to happen very rapidly. So you would need. And, like, without going into details, like, I'm in the middle of, like, every single one of these discussions. In almost every breakdown, it's like, people do not understand how institutions work. They don't understand how the software would work. They don't understand how the LLM would work. They don't have the product that would actually work in that environment. And they still, at the end of the day, are not saying we're going to charge on value.

8:43

Speaker A

Okay, how do institutions work? Why is it that we get these genius models that are 160 IQ, they can solve incredible math and they're not just like everywhere, all the time. What is slow?

10:00

Speaker B

I mean, the simple version is they're 160 against a test. Yeah, but the test isn't. It's a concatenation. The simple math would be it's 160 on one test. But you've got to pass differentiated tests over a long period of time. So it's a thousand tests.

10:11
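The compounding Karp describes, a model that looks brilliant on one test but degrades to "0 IQ" over a long chain of dependent steps, can be sketched numerically. The 95% per-step success rate below is an illustrative assumption, not a figure from the interview:

```python
# Sketch of compounding failure over a multi-step task: an agent that
# passes any single step 95% of the time (assumed figure) rarely
# survives a long chain of steps that must all succeed.

def chain_success(p_step: float, n_steps: int) -> float:
    """Probability of completing n independent steps with no failure."""
    return p_step ** n_steps

if __name__ == "__main__":
    for n in (1, 10, 50, 1000):
        # Success probability collapses as the chain lengthens.
        print(f"{n:>5} steps: {chain_success(0.95, n):.4f}")
```

At 50 steps the chance of a flawless run is already under 8%, which is the gap between scoring well on a single benchmark and doing differentiated work inside an institution over time.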

Speaker A

Yeah.

10:26

Speaker B

So de facto, by the 50th step, it's 0 IQ. But then there's also. There's also. Yeah, I mean, it's like, it's, it's, it's insane. Like, no, I love when I hear about all this is going to replace, and then I go to our clients and they're like, could we have more? We don't even have the capacity. Like, it's like, it's a surreal thing. Like, would you guys like to be FDEs? Because we need some help. If you're in the audience, completely seriously, and you're aligned, broadly speaking, with: America is a great country. You don't have to agree about anything else. And you're out there and you're technical or just smart, apply. We need you.

10:26

Speaker A

Okay, America is a great country. Put aside Democrat, put aside Republican. Is democracy the correct formulation to decide the future of AI? Should the American people be voting, or should it be handled by private companies?

10:59

Speaker B

Now, America? Well, it depends. Like, yeah, great. So in the war fighting context, the Department of War has to be the arbiter of what gets deployed.

11:15

Speaker A

Now as a citizen, I vote for the Department of War.

11:27

Speaker B

Exactly. Yeah. Okay. But, but I'm just saying. So I want to split domestic and, and foreign, because, like, we in this country have God-given rights, literally given to us by a higher being. There's a right of free expression, which we're exercising all the time, and it's very important to us. There's a Second Amendment, which I exercise. I, I shoot very well. I would encourage you guys and other people listening to avail yourself of the Second Amendment. Yes. It is there to protect ourselves in case the First Amendment fails. That's the reason it's there. There's a Fourth Amendment, which is essentially we have a right to privacy. Okay. We have those rights. Adversaries trying to kill us in Iran do not have those rights. Sure. And I don't believe, I've never believed in extending our rights to foreign countries that are adversarial to us. I don't even really believe. I don't, like, you know, Germany, where I lived half my life. They don't have a First Amendment. They don't believe it. And by the way, they've never believed in the First Amendment. They have other rights. That's great. I'm not going to dispute that. But I want our rights here in this country. If you're going to tell the American people you're building what is clearly a dangerous technology, it's dangerous because it will likely take your job, especially if you're white collar. So if you're voting, you know, you're highly educated.

11:30

Speaker C

Have you flipped on that in the last like six months or so? Because I think, I think the last time we talked your general mindset was like high agency, highly productive people will be able to continue to leverage the tools to deliver value.

12:50

Speaker B

Yeah. I think if you're neurodivergent and high agency and you're highly educated, that's great. But if you're not neurodivergent and you're like lawyer 14506, that's a problem. Okay. But let me get to this. And they're linked, but it's okay. On domestic stuff, we have rights that are not subject to majority rule. Like, the majority can vote against us having Fourth Amendment rights; I want that litigated at the Supreme Court, because we are not. Our Constitution is not about majority. It's actually about the rights of the minority. And it's our right. I bet you the three of us have opinions that are very much in the minority that we want to be able to say, at least in the privacy of our own home. Right. And so there are real issues. I'm super sympathetic with restrictions around the use of these products in a domestic context. Even though it's funny, people out there, every conspiracy theorist thinks I'm insane. I'm the only one. Conspiracy theorists, you may hate this, but there's one person protecting your rights to be a conspiracy theorist that actually has a seat at the table, and that person is me. You may not, you may not want to hear that truth, but it's, it is fucking true. And maybe do a little more reading before you pontificate on your absurd and obviously ill-formed and many times stupid opinions. Okay, so, because, like, you're attacking the person who's protecting you. Idiot. It's like fucking so stupid. Do, do use one of the bots to correct your opinion. It's like, I'm being attacked online now. It's like, Dr. Karp is anti-progressive. I've been progressive my whole life. I'm just telling you the truth. These things are going to take your job. Okay. So then, but in the war-fighting context, and it's the primary justification for these products, has to be: there are two relevant powers now, US and China. This is a have/have-not world.
It's going to be either us or them basically deciding the world order, because, like, these other countries, and maybe India will get involved, maybe the Arab and non-Arab Middle East. But currently, on the trajectory we're on now, there are two places where these things are being developed and deployed. It's us or them. And I'm not particularly, you know, I'm not, I'm not out to hurt China. I'm just out to. I think we should win. I'm not trying to hurt them. And in that context, you can't say we're not going to do X, Y, and Z. I mean, I could give you examples, but, like, there are data sets that are publicly available in the US market that I don't think should be used against you and me in a lawful law-enforcement context, much less with the help of, say, AI agents and ontology. Yeah, but if you don't use it on the battlefield, obviously Iran's going to use them. You don't think they can go online and buy those products? And by the way, without going into somewhat classified data, those things, in combination with other things, are lethal. Like, a lot of people who want to hurt America on the battlefield end up dead because of our ability to aggregate and then figure out what's going on in the battlefield before they can figure out what we're doing. And so, like, I'm very much in favor of it for moral reasons, but I'm also in favor of it. I don't know how else you explain this to the American people: we're going to take your job, we're going to eviscerate your ability to have money and power, but we're not going to defend you on the battlefield. It just seems like, yeah. Well, you know what's actually gonna happen? Nobody believes me in tech, but there's gonna be a movement in this country that gets very strong very quickly to nationalize these things. First, it's gonna be take away all our money. The billionaires are evil. You may not have heard that. Super evil. And if you take away their money, it'll help poor people.
Yes, that's really important to understand. Making rich people miserable is the only way to help poor people. That's obviously true. Once you've learned that, the next thing you're gonna learn is that we have to nationalize it.

13:07

Speaker C

They're going to, they're going to quote you on that. Yeah, well, we've quoted you on less.

16:50

Speaker A

So, so, I mean, it sounds like you're, you're, you're closer to Dario on, you know, potentially 50% of early stage white collar job loss. Like, you're, you're aware that there's a risk. At least everyone's aware you're aware of it. What do you see as the solution?

16:55

Speaker B

Well, first we just have to, I mean. Well, I mean, the obvious thing is, okay, we can't have any migration here. Like, how are we going to create more jobs? The problem, in fairness, not that people want to be in the business of being fair to policy leaders, is that we are dealing with technologies that will determine the policy decisions. So you can't just pretend they're not happening. Like, step one is, like, it's going to be hard but possible to make this society work, given that transforming it requires these technologies. Like, I really like the people who are here. They're not here because they like me. Like, maybe they're here for my jokes. High quality in some cases. But it's like a long trip, and we're here, and I'm the only one who likes this weather. It's great weather. I brought it for you. But they're here because they've seen their business being transformed, and this is happening in America more than anywhere. So we have to win those battles. But the costs are going to be very high, and so you have to work back from: okay, the costs are going to be very high. We can't put oil on the fire. It's like, you know, it's like, well, getting jobs for all Americans is going to be hard, and people maybe who become Americans. But it's like, you have to have different policies around migration. You have to have different policies around how we train people. Like, currently, if you're a young kid in high school and you're neurodivergent, they're literally, like, chaining you into your chair and feeding you medication so you can have skills that are not valuable. Yeah. And then we'll probably, over time, have to have, like, a discussion of, like, yeah, if you go into this career, you're not going to have a job. Like, a really honest discussion about that. These are the places where you will likely have a job. And, yeah, help me.

17:10

Speaker C

You know, it seems like we as a country will probably head down a more European path, where it becomes very, very difficult or near impossible to let people go. Do you think that's correct?

18:53

Speaker B

Germany?

19:09

Speaker C

It's much harder to. To lay someone off. I mean that does impact. I mean that does impact the, the growth of. Of companies. But I think many Germans would argue that's probably.

19:10

Speaker B

Yeah, you know, Germany's. I mean, I won't. I, I'll answer your question. Germany's an interesting place. I did this thing in German where I basically told the truth, which you're not allowed to do in Germany. It's like, you know, it's kind of a really bad situation: the economy sucks, the migration thing's a complete disaster, and the energy situation, like, compounds everything. And I got thousands of people literally saying, thank God someone told the truth. And there are a lot of people like you guys, young people building things, that feel hampered and are correct to feel hampered. I think the American version, if we're not careful, is not going to be the German version. I think it's going to be hang the rich. It's like, I think it's going to be not protect everybody else. It's going to be like, look, this is too dangerous, and we're going to hang the rich but not really help the poor. And in fairness to the German version, like, you know, German health insurance, all that stuff, it works. Like, I was poor in Germany for, like, a decade, and, like, I had the best life on the planet. Like, being poor in Germany is, like, better than being rich here on some days.

19:22

Speaker A

So basically policies that lift the floor re education, training.

20:27

Speaker B

So if you want to do what we could do here. Yeah, the things we could adopt from Germany are. Germany has three kinds of high schools.

20:32

Speaker A

Yeah.

20:39

Speaker B

Two are vocational. Sure. One is academic. Vocational also has a bad, like, a weird vibe here. Like, vocational training in Germany is very technical. Like, the people building the cars at BMW, or even in the French version, Airbus, have very complicated jobs. They didn't go to college, they went to a very, very high-end high school, and they come out without any debt. And that stuff is really valuable.

20:39

Speaker A

Yeah.

21:05

Speaker B

So if you want to, you have to completely transform our educational system and, and go very young into, like, training people to do things. You also need to change our testing system. Like, different forms of intelligence are. All of our tests are built around things that were valuable in the Industrial Revolution. It's like, you want to pull out all the dyslexics, all the neurodivergents, everybody who can't sit or needs to build or wants to build. They have to go into a separate slot of, like, yeah, we should have gotten you before you got turned down at Goldman and, like, said, this is, like, that's a waste of your time. You could be building something important, and

21:05

Speaker C

what else would be a part of the good outcome?

21:37

Speaker B

Well, the most important part of a good outcome is we show our adversaries you can't fuck with us, and we're the best, we're the best military in the world. I hope, I believe we're doing that right now. On the good outcome side, we, yeah, we go around, and then on the commercial side we go to all these high-infrastructure, you know, hospitals, manufacturing, all these things, complicated infrastructure, and we AI-enhance all of them. So the products are legitimately the best in the world. And we rebuild manufacturing in this country. Like, a big problem for us, including on the battlefield, is our manufacturing just is not up to where we have to be. And that, by the way, requires reskilling, upskilling humans. And we're doing this all over the place. I mean, the guys writing a lot of the scripts for the targeting, these people, they're, like, high school, not college grads. The people building batteries and all these things using our products, these are high school, not college guys. There's a lot of opportunity there. But, you know, one of the things I told the Germans, and I would say to us: I was like, you know, Germany, you have to call it a crisis. We do need, like, this is a crisis moment. America's tomorrow is not going to look like it looked at all, or we're going to have radicalism on the right or left. The problem, the danger, is if we don't do these reforms, you are going to get the pitchforks, because then the only solution people are going to have is, well, you know, let's go after the unlikable rich people in tech, especially AI tech. But then what can work is, yeah, close the borders, keep them closed, start doing huge vocational efforts. Change how we test aptitude, so we have an accurate diagnostic of where you could be slotted. Be ruthless, and, like, you know, it's like, in certain. Find out new ways to test and do ruthless testing and slotting, and then also go around to universities and just, I mean, you know how when you, like, you want to smoke a cigarette, it's like, this cigarette may be harmful. Maybe we should be putting that on universities. This university, this, this university is harmful for your. You know, I'm libertarian. You want to go to university, student

21:41

Speaker C

debt may be harmful to your future

23:36

Speaker B

and your personal life. Explain to someone you got a million dollars in debt. Million dollars in debt. I mean, maybe if you're six nine and you can get away with that, but the rest of us have to provide.

23:39

Speaker A

That's funny. Help me square this idea. You, you were, you were talking earlier today about people misunderstanding your business and.

23:50

Speaker B

Yeah.

23:59

Speaker C

What's it like to read about your business?

23:59

Speaker B

Oh, I mean, first of all, it's, I mean the part I, I hate it, but then the part I love is, and it's like you are valuable. Your, your value is pretty directly convergent with people's inability to understand what you're doing.

24:02

Speaker A

Yeah.

24:18

Speaker B

So it's like, it's like all these technologies are potentially commodifying everything.

24:19

Speaker A

Yeah.

24:23

Speaker B

Okay. So if you are a business that is, you know, not services, not product, but both, but also works on tribal knowledge, on data, and every single business you make is individual. And so, yeah, that's a crazy valuable business.

24:23

Speaker A

True in a lot of industries where if the broader business community doesn't understand your business, you might have a short report, but you'll have a way less competition because people aren't copying you. They don't understand the playbook.

24:38

Speaker B

It's impossible to copy certain, like. And we neglect this. Like, you know, you almost, like, you even see it culturally: luxury products dominated by the French, watches dominated by the Swiss, currently certain kinds of war fighting dominated by America. And it's like, it's, it's very hard for people to eviscerate these cultural advantages. And our products augment that, which makes it, you know, augments the differentiated specific over the generalizable. And that's where literally all the value is going to go. And it's going to be like a waterfall. And that's the problem with a lot of software companies. It's like product ABC. But then when you read it, it's like one of the more depressing things that you guys probably confront. But it's like a huge market opportunity for you. It's like, where are the experts? Like, it's like, you know, it's like, you know, you, you hope and pray. Like, I'll tell you the funniest thing about my life now, and people internally know: almost every day, I'm like, wait a minute, I'm the adult in the room here. It's like everywhere I go, it's like, it's like, wait a minute. And it's big. So it's, it's surreal when you read about these things. And it used to really frustrate me, but now I kind of just think, well, like, I can't believe we're still viewed as crazy. It's like everything we're doing is the only thing that's working. I mean, like, I don't want to, like, spend a lot of time on our essentially baller numbers from last year. But it's like, you know, clearly our shit works. Clearly nothing else is working at that level. And you would think they would take, like, I don't know, 10 minutes to think: okay, well, the thing I believed and thought would work didn't work at all. This thing I thought was insane has, like, a Rule of 40 score of 127, when, like, 40 is considered, like. But they don't.
Yeah, but it's sometimes frustrating, honestly. And the hard part, actually, is I kind of view it as a feature. Internally, we get these bright-eyed kids. It's so funny. I mean, we get the best people in the world, but, you know, just like I was probably at 21, they're very romantic. It's like, but why does the adult not understand this? It's like, but, but, but the adult expert tells me. It's like, I don't know when you guys had the drop-the-shoe moment and you realized that, like, the adults are, like, you know, on crack or something.

24:49

Speaker A

Like, it's okay.

27:11

Speaker B

Yeah.

27:12

Speaker A

Last question. Would you rather have $10 million or access to ChatGPT in 2012? It's a question that's going viral right now.

27:13

Speaker B

I have to choose one or the other. I mean, okay, I'm just.

27:25

Speaker C

I don't think he needs.

27:28

Speaker B

I don't think he needs my social life in grad school. There we go. Okay, that's on your p. How about a new pick?

27:29

Speaker A

We just have to ask.

27:38

Speaker B

You know, it's like, great. I got to take something I valued. Oh, okay. Yeah.

27:39

Speaker A

The most valuable thing, social life in grad school. Well, thank you so much for taking the time to come chat. It's fantastic. We'll talk to you soon.

27:43