Is this your last "job"? The AI Economy With AEI's Brent Orrell
51 min
Jan 27, 2026
Summary
Brent Orrell from AEI challenges alarmist narratives about AI's impact on employment, arguing that technological disruption will be gradual rather than catastrophic. The discussion emphasizes that meaningful work transitions require focusing on problems to solve rather than specific jobs, and that adaptability and domain expertise matter more than narrow skill training.
Insights
- AI job displacement fears are driven by evolutionary anxiety bias, not economic reality—most Americans have basic needs met and should focus on meaning/purpose rather than survival
- Technology adoption takes decades (electricity took 20 years), creating time for workforce adaptation if policy addresses domain expertise acquisition for entry-level workers
- The most productive AI users are experienced workers with 15-20 years of tacit domain knowledge, not young tech-savvy workers, inverting common assumptions about generational advantage
- Rigid employment classifications (employee vs. contractor, student vs. worker) from the 1930s-70s now inhibit the portfolio/meandering career paths that AI economy will require
- Broad liberal education (history, philosophy, founding documents) builds mental flexibility and dimensional thinking more valuable than narrow AI degree programs
Trends
- Shift from job-focused to problem-focused career planning as economic uncertainty increases
- Growing recognition that tacit knowledge and domain expertise are AI-era competitive advantages over technical credentials alone
- Policy lag: employment law classifications haven't evolved to support the portfolio careers and skill-stacking that the modern economy demands
- Revaluation of humanities and liberal arts education as foundational to adaptability in an AI-driven economy
- Entry-level job scarcity as automation reduces the training pipeline, threatening domain expertise acquisition for new workers
- Media sensationalism around AI job losses conflates correlation with causation, obscuring actual economic drivers
- Rising focus on psychological resilience and meaning-making as economic abundance increases and traditional work identity erodes
- Uneven technology diffusion across sectors creating "jagged edge" adoption patterns rather than sudden disruption
Topics
- AI Impact on Labor Markets and Employment
- Workforce Development and Job Training
- Technology Adoption and Economic Disruption
- Career Planning in the AI Era
- Domain Expertise vs. Technical Skills
- Education Policy and Curriculum Design
- Entry-Level Job Market and Apprenticeships
- Employment Law and Labor Classification
- Meaning and Purpose in Work
- Media Coverage of AI and Technology
- Economic Dynamism and Business Churn
- Tacit Knowledge and Organizational Learning
- Adaptability as Core Skill
- Liberal Arts Education Value
- Policy Responses to Technological Disruption
Companies
Palantir
Discussed as example of alternative workforce development model recruiting high school students with intensive libera...
People
Brent Orrell
Senior fellow at American Enterprise Institute; expert on workforce development, job training, and future of work in ...
Kevin Frazier
Host and senior fellow at Abundance Institute; director of AI Innovation and Law Program at University of Texas Schoo...
Ethan Mollick
Referenced for concept of 'jagged edge' describing uneven technology adoption across sectors
Choudary
Author of 'Reshuffle'; argues competitive ecosystem drives skill changes, not individual task-based training
Quotes
"AI only works if society lets it work. There are so many questions have to be figured out"
Opening segment•Introduction
"We are poorly adapted to abundance and we don't really know what to do with ourselves in circumstances of abundance"
Brent Orrell•Mid-episode
"What's the problem that I want to solve? That's a very different way of approaching this question of work"
Brent Orrell•Career planning discussion
"The people who are benefiting most from AI are the ones who have spent 15 or 20 years in the workforce accumulating vast stores of tacit knowledge"
Brent Orrell•Domain expertise discussion
"It's like you think it's complicated building a chip fab? Wait till you try to build into people the flexibility and cognitive skills that they need"
Brent Orrell•Skills development discussion
Full Transcript
When the AI overlords take over, what are you most excited about? It's not crazy, it's just smart. And just this year, in the first six months, there have been something like a thousand laws. Who's actually building the scaffolding around how it's going to work, how everyday folks are going to use it? AI only works if society lets it work. There are so many questions have to be figured out and... Nobody came to my bonus class. Let's enforce the rules of the road. Welcome back to Scaling Laws, the podcast brought to you by Lawfare and the University of Texas School of Law that explores the intersection of artificial intelligence, policy, and the law. I'm Kevin Frazier, a senior fellow at the Abundance Institute, director of the AI Innovation and Law Program at the University of Texas School of Law, and a senior editor at Lawfare. Today, we're joined by Brent Orrell, a senior fellow at the American Enterprise Institute, where his research focuses on workforce development, job training, and the future of work. Brent has emerged as one of the clearest and most empirically grounded voices in tech debates, pushing back against both technological alarmism and complacency. His work examines how artificial intelligence is reshaping tasks, skills, and career pathways across the knowledge economy. It's going to be a great podcast. To get in touch with us, email scalinglaws at lawfaremedia.org or hit us up on X or Bluesky. And with that, giddy up for a great pod. Brent, thanks so much for coming on Scaling Laws. You know, I really appreciate being invited. It's great being with you. And I love the whole network that you're involved with in this endeavor. So good to be with you. Well, thank you very much. You know, my mom always said, make friends who are smarter than you and you'll look pretty good. And it's been working in my favor for a long time now. 
And now I'd like to extend the invitation to come into the inner Frazier AI circle to you, especially because I'm a big fan of your work. And we've got shared Oregon roots, which you can't say about most folks. So I appreciate that we share both a love for the Oregon Ducks and a deep interest in AI and the future of work. Yeah, absolutely. It's very rare. It's almost as weird as finding another person named Brent, which I rarely, if ever, encounter. Oh, fascinating. Well, if I find one, I'll send one your way. We try to keep tabs on each other. Exactly, exactly. You'll be pleased to know that my high school teacher in Beaverton sent an article my way not too long ago that, according to The Onion, there's a global shortage of Kevins. And so I'm trying to get people to change their name. This is the real public policy issue we need to be addressing: why more people aren't named Kevin or Brent, I guess. Nobody ever worries about the shortage of Brents. I mean, it's amazing. It's tragic. I know. Instead, we wake up to headlines like this this morning. This is from CNBC, reporting live, I guess, from Davos. And the headline is "AI impacting labor market like a tsunami as layoff fears mount." And Brent, I know you are no stranger to debates about the future of work and the idea that technology is going to wipe out jobs immediately. What's your gut reaction to a headline like this? And to what extent is it emblematic of media coverage of this topic? Yeah, it's clickbait. And that's what it's designed to do. It's designed to tap into our very deep-seated preference for worrying about things that we can't see, leveraging that part of the brain that evolved to respond to fear. So, yes, that's what I think of it. Yeah. 
And the tragic point is that if you scroll down, you have to go through several paragraphs before you eventually reach a section of this article in which the author quotes several folks, for example, that say companies attributing much of the blame for job cuts to AI should be taken, quote, with a grain of salt, as AI redundancy washing will be a significant feature of 2026. And then later, they quote another expert who says, I would argue that those 50,000 job losses are not driven by AI, but are just driven by the general uncertainty in the market. It's too early to link those to AI. So what is it, Brent, about the future of work and technology that drives this immediate response to be able to say, ha, it's the AI that did it? And to what extent is this just the latest incident of this sort of find-the-tech-boogeyman response, blaming every economic woe on it? Well, I mean, it's multifactorial, right? There's not one source of where this tendency comes from. We've already talked about how we are built. What I frequently tell people is, you know, every single one of us, from an evolutionary standpoint, is a survivor, right? We, and our ancestors, were the anxious ones, the ones that were constantly looking for danger. And they survived, right? So it's a survival function. And it was reinforced through natural selection over millennia, okay? We can't escape it. It's who we are. But it's very poorly adapted to our current circumstances. I teach a class every year here through AEI called More Than a Paycheck, and it's really trying to get young people in particular to step back a little bit and to think a little more realistically about the nature of the challenges that they're going to face in life. We refer back to Maslow's hierarchy, where that bottom layer of food, shelter, warmth, that's what we evolved to try to fulfill, those needs. And it was hard. 
And again, we survived, our ancestors survived those challenges. However, in our current day and age, it's really difficult to not have those needs met. I mean, you can try – if you really, really try, if you make every mistake in the book, if you actively seek out disaster, you can – It takes effort, right, to find yourself without any of those affordances. Yeah, you really got to try. And there are people who manage that. And typically, we're both from Oregon, so we've seen the homeless problem. Typically, these are people with deep fractures in their psychology or their personality, extensive trauma. It's just like they drew from the bottom of the deck and really got a terrible hand in life. They do exist. But as a fraction of the population, it's vanishingly small. So for most people, you're not here on the bottom of Maslow's hierarchy. Your life tasks are up here at the top, which is meaning, purpose, spirituality. You know, that's where I speak with the wisdom of my years now. That's where you end up spending most of your mental and emotional energy, addressing those things, because all the other needs are largely met. So this is a long way around saying that we are poorly adapted to abundance and we don't really know what to do with ourselves in circumstances of abundance. And in all likelihood, we're going to be faced with even more abundance, I think, as a result of AI. And this crisis of meaning and purpose is still going to be there. And that's where we need to put our energy right now, I think. And I really like your framing of going back to sort of our base instincts, because another thing that I think distinguishes this point in time, this sort of AI moment, is that there is such a temptation to say, AI is just the latest wave of every technology we've seen. We can just press play on the policies or interventions we leaned on then and apply them now. 
But I do think that comparing and contrasting the turn of the 20th century to, for example, this current AI moment, something that podcast listeners have heard me say previously, and apologies, folks, for repeating myself, but you cannot find a single article in the New York Times worrying about the state of the horse pooper scooper. There's just – you can't find it. There's no article that says, what are we going to do? How do we protect this profession? How do we ensure that they have an on-ramp to their next job? What skills do horse pooper scoopers have that are transferable to the next one? And I think, to your point, Brent, something that I haven't really thought through before is that psychologically we are being asked now to think about the future of technology over a much longer time horizon, right? You don't find as many discussions in the New York Times, or whatever the most popular outlet was in 1908, when the Model T is getting sent around the country. You don't find people saying, well, what are we going to do in 1958 about this technology, right? That's just not a conversation. And yet here, the most popular articles are, okay, here's my AI 2048 report or my AI 2058 report. And so I think there's a natural response, which is to say, there's so much uncertainty about that future. How, going back to those reactionary instincts, do I protect myself now? And that's a really hard thing to grapple with from a policy perspective. Yeah, and it's protect yourself from what, though? I mean, that's the thing. I mean, like I said, it's really hard to starve to death in the United States or to freeze to death in the United States. It's not impossible, but it's very hard. So if that's not really on the table, then what is? And you never read the article, or rarely read the article, about, you know, how are we going to generate meaning in our lives if we don't have work to do that for us? Because I think as Americans, that's what we do. 
We fill the meaning void with our professions and our working hours. And that's how we, in many ways, define ourselves. Even if it isn't like the job is so attractive or something like that. Even if it's just this is how I contribute to the world. This is how I support myself and my family. The dignity of work, we rely on it so much. And if that undergoes a transformation, not that it's going to be taken away, because I think there's always going to be work, but as work is transformed and the skills that we rely upon, the things that we're good at, are no longer relevant to the modern economy, that's where this crisis of meaning comes in. And I don't want to downplay this, because I think we made a horrible mistake when we went through the automation and global trade shocks of the '80s, '90s, and early 2000s, and we simply didn't take this question seriously at all. And so while 80, 90 percent of Americans are better off today than they were prior to those shocks, it still left millions of people behind, with no real thought-through policy framework for helping people navigate that transition, to find their way to the new thing and to attach themselves to the new thing so that they could continue not just to generate income, but to generate meaning in their lives. So that's a – I don't want to downplay this. I think it's really significant. I'm sure that among the people who were scooping up the horse manure, there were some for whom that was the only job available. And to have that ripped away from you, even that is a very painful thing. So we need to not downplay the psychological stress that people encounter when they do lose their jobs or when their jobs are transformed in a way that they're just not equipped to remain in that job. And so we need, it's a project I'm working on right now, we need to not make the same mistake again, which is to look at what are our policy options if we experience the kind of displacement that many people worry about. 
I think it's not going to roll out the way that people think it is. I think it's going to be much more gradual. I don't think we're going to wake up one day and we'll all be out of jobs. It takes time. It takes capital. It takes all sorts of things to make this happen. And so we've got time to work on it. We've got time to think about it, but we've got to do it. Yeah, and something that comes to mind, and forgive me for leaning into CNBC's title here about the tsunami of AI coming, but what you're kind of reminding me of is, if you frame things like a tsunami, the assumption is people can't swim, right? You don't survive a tsunami by swimming away. But if we know there's a rising tide and you can swim, you're going to be fine, right? You're going to find that next spot to land. But that change in mentality does have a deep psychological basis, because if you're operating purely from a scarcity mindset – my wife will tell you, if I find out I'm going to be without peanut butter for seven days, I become a madman of consuming only PB&Js for the next 48 hours. You just do weird things when you believe you're going to be without those core essentials. And if instead, adopting your sort of baseline optimism, you start to operate with a way different mentality. And I do feel as though, reading the headlines, talking to average Americans, you find a lack of that optimism. And I think it is because, going back to the China shock or whatever we want to call it, there was a sense that people were just forgotten and we moved on. And so I'm curious, from a policy perspective, there's a real tension here, because something that I've been trying to dive further into, thanks to my econ degree from the lovely University of Oregon, is the importance of economic dynamism, right? 
And making sure that the surest way through a transition isn't to defend the jobs of the past and say, hey, you have a right to that specific job for life, but instead to say, how do we become the place where the next jobs occur first? Because now you're providing new opportunities. You're having that positive economic churn. But on the same side, you want to make sure there is some ground where folks won't sink beneath, right, that they have a clear path forward. And so you've probably studied the history of these evolutionary technological moments and sort of societal adaptation to those. How do you begin to strike the balance between a heavy-handed interventionist approach that says, Kevin, we're going to train you to become a prompt engineer by 2026, versus something that's a little looser? Yeah, I mean, so again, I'll go back to what I say to the classes that I teach around this, which is the first thing you need to do is relax. Okay, take a breath. Trust that in a continent-sized nation of almost 350 million people with a 30-plus trillion dollar economy, there's going to be a place for you in that. Somewhere, there's going to be a place. You're not going to starve. And so you've got to just, like, shake it out a little bit and relax, and then think about, rather than what job am I going to do, and I think this is a really critical question for the AI era, it's not what job am I going to do, or even what education or what skills development should I have. I'm increasingly of the view that the right way to frame those kinds of issues about the future is to ask yourself, and this isn't the only question you need to ask, this is just the first question you need to ask, what's the problem that I want to solve? That's a very different way of approaching this question of work: what's the problem that I want to solve? That isn't the last question. 
That doesn't give you all the answers, but it taps into this idea of vocational calling, that each of us is built in a certain way that gives us certain strengths and interests that we want to try to apply. Because actually, I think that's where people pick up the meaning and purpose that they're looking for, because it taps into this very deep part of themselves. Once you have defined that, what's the problem I'm interested in? What are my intrinsic interests in life? What do I do, not because somebody tells me I have to or because it's supposed to be good for me? The things that I don't have to be coerced into doing. What are those things? And then what are the skills associated with those things? And where does the demand for those skills show up in the economy? I think that's the right order. And once you've established that as your package, then you can start thinking about educational options. So that's my sort of meta approach to this question: don't fixate on income, don't fixate on particular jobs or sectors, don't fixate on particular skills. Like coders. Go talk to a few coders these days. The thing that was supposed to be the golden ticket is still very valuable, but it's not that immediate on-ramp to the million-dollar salary that people thought it was. I mean, it's more nuanced than that now. So, start with this package and then start building your plan from that, rather than, I'm afraid I'm going to starve to death, and therefore I'm going to pick a profession in which I have no interest, and I'm going to do that because that profession is going to protect me from disaster. It's just like the Terminator scenario is not a real thing. It's just not. And to go back to the analogy of the tsunami, it isn't a tsunami. I would analogize it more to rising sea levels. You know, we have an opportunity right now to adapt to rising sea levels if we choose to do it. 
You know, hardening the infrastructure, getting people out of the way of where land is going to get flooded, whatever. You know, we have a lot of things that we can do. We can even change environmental policy to slow things down. Or we could, you know, geoengineer the planet so it cools down a little bit. I mean, there's all sorts of things that can be done. But probably none of us on this podcast or listening to it, you and I, are ever going to see this problem of climate change in its most extreme forms. Even if it happens, we're not going to see it, because it's 150 years in the future. So put that anxiety away and think about adaptation. So all that to say, same thing with work and automation and artificial intelligence. It's going to take years, decades maybe, to arrive at this place of a transformed economy. And so it's not really our immediate concern. Yeah. And what I love, too, about your framing of what is the problem I want to solve is a couple things. First, it's pro-social, right? It's not saying what is it about myself that I want to become, in terms of I want to be X or I want to be Y, or I want to achieve income level Z or income level A, or live in this city or that city. But how do I be a good contributor to society? And this is the part that I do think we miss in so much of the job conversation: humans are really good at a few things. Number one, we create a crap ton of problems. We're never going to have a shortage of problems to solve, right? There are always, going back to your pyramid, there are always new ways we can increase meaning, we can increase community, we can further well-being. One of my favorites is, could you imagine explaining to someone in the 1700s that there are dog walkers? Or even worse, entire folks whose sole job is to just live in your home while you vacation and watch your dogs. It's mind-blowing. This is my daughter, by the way. No offense. I think it's a great gig. I think it's great. 
She's really happy with it. Great gig. And it's great that it exists, and we will find other jobs that no one can imagine to satisfy new desires that we haven't even developed yet. And so I'm just not pessimistic about our human capacity to, number one, I guess, create problems and, number two, create demand for new things. And I also like that when you focus on what is the problem you want to solve, that's something that requires a deep search. And presumably, if you find that problem, you're willing to go do new things, go get that new credential, go learn that new skill, go apprentice, go move for opportunity, because you're motivated. It's not, to your point, Brent, earlier of, oh, well, my dad wants me to study poli sci, so I guess I'm going to do this and hope the cards work out. No dad wants their child to study poli sci. In fact, my dad said I couldn't. We'll get through my dad trauma on another podcast. So, Brent, we talked a lot about how this isn't a tsunami. Why not? Why aren't we going to see the sort of forecast of, we wake up on February 21st, 2026, and suddenly there's unemployment of 10% in Y profession? Why isn't that likely to occur? You know, it's a good question. Let's look at what I think is the closest analogy to AI, which is probably the introduction of electricity. It took 20 years for the economy to reconfigure around electricity. It enabled moving: you could have a factory anywhere, because you were no longer dependent upon a power source like water or a nearby coal mine to keep your operation running. So it could be done anywhere; it could be moved closer to the other resources that were required and to where the people were to do the work. It allowed for the reconfiguration of manufacturing facilities from a vertical layout to a horizontal one, where you didn't have this central power thing that you built everything else around. You couldn't get away from that thing, so you had to go up and build around it. 
Okay, so when electricity came along, everything went horizontal. Now, 20 years, of course, is not a long time by any stretch of the imagination, but it's not tomorrow either. And we are talking about rebuilding our economic infrastructure around a new technology, and that is going to take a lot of time and money to do. I mean, there is a limited amount of capital available. Right now, it's mostly being poured into the infrastructure that we need for artificial intelligence. So it's going to take time to make that transition, and it's going to be very uneven. It'll happen in some places, and in some sectors and industries and businesses, before it happens in others. I think it's Ethan Mollick who calls it the jagged edge: in some places the technology is going to race ahead, and in some areas it's going to take a lot longer for it to be absorbed. And then there are all the other constraints around it, one of which is how do we get people skilled up to be able to do it. That's all really complicated and takes a long time and a lot of effort and a lot of false starts. You know, like 95% of the current pilots around AI are said to fail, which sounds appalling, but it's true of every new technology. There's a lot of failure before people say, okay, we can't just bolt this on to what we currently do. We've got to rethink how we do it. I strongly, strongly recommend a book called Reshuffle; the author's last name is Choudary, C-H-O-U-D-A-R-Y. And he says from a skills standpoint, we're focused on the wrong thing. We're focused on individual tasks and the skills associated with those tasks. His argument, and I find this extremely compelling, is that it's actually the competitive ecosphere that will drive changes in skills. And that's why it takes time for businesses to restructure so that they know what kind of skills they need. Businesses can't tell you right now what they need. 
They're lucky if they can see a few months ahead, especially in the tech sector, but I think it's even more true in other sectors. They just don't know, because they haven't done the re-engineering that they need to do in order to tell us what's going to be needed. Having said that, that in itself is extremely important information for all of us, because what it means is that under conditions of extreme uncertainty, the most important skill is adaptability. Are you a good learner, and can you pivot? And those skills are not typically things that we sit in classrooms and learn. That's the stuff that we pick up as we move around the social environment and interact with family and friends and civic organizations and you name it. It's the school of life that teaches us those things, not the elementary school. I mean, schools are part of it, but it's not mainly formal education. It's tacit education that is most needed. I keep telling people, it's like, you think it's complicated building a chip fab? Wait till you try to build into people the flexibility and cognitive skills that they need in order to learn new technology. That is an arduous task. Much harder. Much harder than building a factory. And there's so many deep veins to tap into here. I think the first one that comes to mind is thinking through this slow process of technology diffusion and the friction that we've seemingly forgotten in some of these dire forecasts. Because if you've spent time at an organization and they just switched from Teams to Slack, that in and of itself is a six-month, if not year-long, process: okay, we've got to do trainings. We've got to handhold this person. We have to explain what a channel is, here's how you use hashtags. Now apply all of that to something like AI, which is constantly changing and evolving and becoming more sophisticated. And those frictions just add up over time. 
And so that, I think, is so important to point out. And also the fact that in any economy, especially an economy of 350 million people, there's always some space for lower-productivity firms. So even if you see a firm that isn't the first mover on AI, they don't disappear overnight just because their competitor is more efficient or more productive. Yes, they will eventually disappear, but it takes some degree of time. Yeah. And the other thing that I really want to highlight is this focus on the sort of entrepreneurialism of saying, look, whatever skills I need to acquire, I can go out and do that. And there's a sort of passivity that comes into saying, no, I need to find it at this school or through this program. To your point, Brent, a lot of learning happens just by exposing yourself to new scenarios and inviting yourself into those scenarios. Now, obviously, having the time, having the opportunity, having the resources to do that isn't always as widely available as we'd like it to be. And I think that's something we can address. But I've learned a lot about life simply by living in Helena, Montana, and Miami, Florida. And let me tell you, you need to have very different skill sets. You need to have very different speaking styles. You need to have different habits, hobbies, interests, so on and so forth to thrive in those respective communities. And so a lot of this to me boils down to: the folks who are curious and who are willing to learn how to swim in new environments are going to be the people we look for. And to your point earlier, we don't have existing institutions that necessarily teach those skills right now. And that's where I think we have a sort of culture that isn't suited to adaptability right now. Because if you go on to your TV, you can control every channel you see. You can control what episodes you're going to watch. You can control every feature of that TV. When you go to work, in many ways, you get to select who you talk to, at what point. 
You can tell your boss, say, I really don't like these projects, but I like these. With your own personal life, you're changing your apps. Your whole life is seemingly within your control now. And I think that has a long-term consequence of making people more and more fearful when they feel as though they lack some degree of control. And yet that willingness to experiment and struggle is seemingly going to be more and more important. Yeah, I agree with you completely. And I take it even a step further. I think that our approach to education, our approach to work, often has the effect of depriving us of agency on these questions. And this is just a product of the legacy of industrialization. You know, it's like you go to school to learn narrow tasks. And particularly as you move along in your education, there's an increasing premium on, here's the educational input and here's the economic outcome, and show me the straight line between these things. It's like your dad, my dad, discouraging us from pursuing certain majors. I was a history major. You would have been in poli sci, but you did econ instead, which is almost as good. I never used my history degree formally. I was never an historian. But I used the skills that I gained in my history degree every single day of my career, bar none, because it taught me how to read, how to think, how to write. It was building a whole bunch of implicit skills inside of me that I wasn't even aware that I was gaining, or how they would be useful in the future. I am convinced that our focus on the acquisition of particular skills is very debilitating to us and that we need to revalue meandering in our education, which is, you know, try some new things. Make sure that, if you're really good in STEM-related fields, you get some humanities to go along with that. 
Or vice versa, if you're an art history major, you know, please get some science and math and engineering and chemistry behind you. I really wish I'd done that. Because the study of these things inculcates different ways of thinking. Those ways of thinking will be valuable to you with or without AI. What we're seeing develop right now, I think, is that the people who are benefiting most from AI, and this is a little counterintuitive for me, are not the ones that studied computer science. They're the ones who have spent 15 or 20 years in the workforce accumulating vast stores of tacit knowledge about whatever they've been working on. And then you take that and you sort of infuse AI into that knowledge. And what you get is the 10x, 100x improvement in productivity for that individual, because they already know the business that they're in. They already know the substance of what they're supposed to be working on. And then it's just a question of how do I apply this technology to help me do more, to do it better, faster, whatever. And as recently as last fall, I was thinking, you know, the future really belongs to the young here, because they'll have an easier time adopting this technology. And that has not, at this stage anyway, turned out to be true. It's like age and guile beats youth and energy still. There's a bumper sticker. Yeah. And so that's really the challenge right now, I think, for a lot of people who are trying to get into the workforce: they have a bunch of skills that they've learned in their major degrees or in their community college or whatever it is, but they lack the domain expertise that they need. And it's one of the principal challenges in front of us. How do we get new workers, younger workers, into places where they can acquire that domain expertise? Because acquiring that domain expertise doesn't have a lot of economic value right now.
You know, businesses would do it in the past because they knew they had to. They had the work that needed to be done, and they also knew that they had a pipeline that they were going to have to fill over time. So this is how we train people up, you know, with these more mundane cognitive tasks. And now it's just like, man, I don't want to hire all those people. I want to automate this stuff because it'll save me so much labor. So we've got to figure that out. I think we really need to look at what kinds of incentives we can provide to individuals and to businesses to maintain these entry-level jobs, so that people can have that opportunity to learn the domain knowledge that they're not going to get sitting in a classroom. And I'm going to just add a little bit onto your notion of meandering, because I think that's such a great word. I've been referring to it as the portfolio economy, where at any given point in time, you're trying to develop and expand a portfolio of skills you can offer. And that may mean studying somewhere new, whether that's kind of vocational training or formal education. That could be having an apprenticeship at any given point to then deepen those skills, and then to work in whatever arrangement is required, because we're not going to have the nine-to-five job for the single firm. And so you're constantly thinking through this sort of flexible new world and new economy, and yet our labor laws and employment laws are pockmarked by rigidity. The effort of the 1930s was to classify everything. And in the 1970s, we kind of doubled down on that, saying, how can we further put you in a bucket, categorize you, and make sure that you are trapped in this legal architecture? And while much of that may be well-intentioned, moving between buckets, the sort of meandering we're talking about, is really, really difficult. You're either an employee or you're not.
You're either a student or you're not, perhaps with the exception of student-athletes, but we'll get into the NCAA and NIL in another podcast. But all of these classifications are outdated if we're anticipating a world in which meandering becomes all the more important. And so if you were advising a state governor, let's start there, what would be your emphasis? Because I think for a lot of legislators, particularly at the state level, they're hearing this drumbeat. They're hearing about the tsunami that's coming. Their constituents are concerned. Parents are concerned. And we're seeing a kind of rush to respond to that. But beyond taking a breath, which I think many of them should do, what's kind of top of your list? Our home state provides an interesting example in this. The Oregon Institute of Technology announced last week, I think, or the week before, that they had designed this new Bachelor of Science in artificial intelligence. And so I thought, well, that's interesting. So I started looking at, you know, what's the content here? And it's a skills-based approach. It is, you know, we create stacks of education that move toward a credential. And I think you're right that our general trend since maybe the Enlightenment has been to fragment and classify information and knowledge into smaller and smaller pieces. So this idea that you could kind of create a degree in AI and that that would be adequate preparation for the AI economy, I don't have all the details on the Oregon thing, but it seems to me like it's not holistic enough, right? It is, again, this reliance that we're going to define this down into a series of courses that you're going to take and lead you through this progression, and it's targeted toward preparing you for a job. In the AI-driven economy, we don't even know what those jobs are to start with. So how can you tell somebody who's investing tens of thousands of dollars in a degree that this is the best way to do it?
So, yeah, I think you're right that our approach to knowledge, the fragmentation of knowledge, is one of our problems right now. And I want to link this back to what you're talking about in terms of curiosity. Curiosity is, you know, sort of the jet fuel of how we learn things. And so turning to curiosity and honoring that is probably one of the most important things that we can do. Now, how does that show up in state policy, I think, is a really interesting question. And when I think about it, the first example that comes to mind is, you probably saw this thing about Palantir and how they're encouraging kids to go directly from high school into working for Palantir. I've got a lot of questions about that. It could be a great idea. It could be a terrible idea. I haven't decided yet. I'll bring you back on in a few months to get your answer. I just had a conversation with them last week, so I'm still noodling on it. But the really interesting thing about that project is that they recruit these, mainly men, they seem to be mostly young boys, I mean high school-aged boys. You know, they bring them in, and what do they do for the first four or five weeks? They do an intensive introduction to the founding documents of the United States of America. The Federalist Papers, the Constitution, the Declaration of Independence, you know. And that was why I wanted to talk to them: why are you doing that? Right? Why? Tell me. Tell me why you're doing that. Now, I don't think they necessarily think in the terms that I would like them to think, but their first response was, well, you know, a lot of liberal education is just bad and we want to give people the good stuff. It's like, yeah, I'm totally in favor of that.
What they don't realize is that it's a completely different way of thinking and absorbing knowledge, and the way that it gets inside people's heads, it knocks down and rebuilds our mental architecture, our mental models of how the world works. And that has profound impacts on our attitudes and behaviors down the road. So I think that they're on to something, but they may not know why they're on to something right now. Or they don't have the full picture of why they're on to something. I think that kind of education and training is vital to the future of work in an AI-infused economy. It's not the BS, the bachelor's of science in AI, that is going to guarantee that you get the well-compensated job you're hoping for. What may get you closer is to combine that with this other kind of knowledge: How do we perceive the world? How do we understand how it works? What can history and philosophy and art and these other domains sort of sneak into our brains through the back door that is ultimately going to undergird our capacity to sustain a career, because we have learned to think in more than one dimension? Well, for all those listeners out there, lean into those curiosities and go meander. Brent, I'm sure we'll have you back sooner rather than later. In the interim, Go Ducks. And thank you again for joining. It's been so much fun. Thanks for having me. At our website, lawfaremedia.org/support, you'll also get access to special events and other content available only to our supporters. Please rate and review us wherever you get your podcasts. Check out our written work at lawfaremedia.org. You can also follow us on X and Blue Sky. This podcast was edited by Noam Osband of Goat Rodeo. Our music is from Alibi. As always, thanks for listening. Bye.