Is AI Eroding Identity? Future of Work Expert on How AI is Taking More than Jobs
59 min
• Dec 1, 2025
Summary
Anne-Marie Imafidon, chair of the Institute for the Future of Work, discusses how AI and fourth industrial revolution technologies are transforming work identity, organizational culture, and society. She argues that the real challenge isn't technology adoption but ensuring inclusive decision-making, long-term strategic thinking, and intentional deployment that preserves human dignity and learning opportunities.
Insights
- Work identity is eroding as careers shift from 'jobs for life' to fragmented, multi-role portfolios, requiring society to redefine what gives people purpose and belonging
- Only 2-3 proven AI use cases exist at scale (fraud detection, customer service); most enterprise AI initiatives remain theoretical, suggesting companies need to experiment with 'unsexy' problems rather than chase hype
- Organizational culture that values diverse perspectives and iterative learning from mistakes is the primary differentiator in successful AI deployment, not technical sophistication
- Systemic inequalities and data gaps (e.g., missing occupational data for pre-COVID marriages) will be amplified at scale by AI unless companies actively redesign norms and processes
- Leaders thinking only 3-6 months ahead miss transformative opportunities; adopting a 50-year legacy lens unlocks strategic vision and accountability for long-term societal impact
Trends
- Shift from job-for-life identity to portfolio careers and gig work, creating psychological and social identity crises
- Enterprise AI adoption stalling at proof-of-concept stage; proven ROI limited to narrow use cases (fraud, customer service)
- Growing focus on 'good work' frameworks emphasizing dignity, learning, and intentional technology deployment over pure automation
- Quantum computing emerging as the next major disruption wave, yet receiving minimal attention compared to the AI hype cycle
- Regulatory and stakeholder pressure forcing companies to adopt long-term, multi-stakeholder thinking (worker, company, society levels)
- Diversity and inclusion in tech talent pipelines becoming a competitive advantage for innovation and problem-solving
- Data bias and systemic inequality being exposed and amplified by AI, driving demand for ethical AI audits and frameworks
- Organizational cultures embracing retrospectives, experimentation, and psychological safety outperforming risk-averse competitors
- Younger generations prioritizing climate, equity, and social justice outcomes over traditional career success metrics
- Critical thinking and collaboration skills becoming more valuable than technical skills as AI commoditizes routine work
Topics
- AI Use Cases and ROI Measurement
- Work Identity and Career Transformation
- Organizational Culture and Psychological Safety
- Diversity and Inclusion in Tech Talent Pipelines
- Fourth Industrial Revolution Technologies (AI, Quantum, Big Data)
- Good Work Charter and Worker Dignity
- Long-Term Strategic Planning and Legacy Thinking
- Data Bias and Systemic Inequality in AI
- Skills for the Future Workforce (Critical Thinking, Collaboration, Creativity)
- Ethical AI Frameworks and Audits
- Gig Economy and Portfolio Careers
- Gender Representation in Technology
- AI Hype Cycle vs. Proven Applications
- Regulatory and Stakeholder Accountability
- Educational Curriculum Reform for Tech Literacy
Companies
Institute for the Future of Work
Anne-Marie Imafidon's organization researching work transformation, good work standards, and fourth industrial revolution technologies
Stemettes
Organization led by Imafidon working with young people historically marginalized from the tech and wider STEM/STEAM industries to increase diversity
People
Anne-Marie Imafidon
Future of work expert, tech CEO, computer scientist discussing AI's impact on work identity, organizational culture, ...
Geoff Nielson
Podcast host interviewing Anne-Marie Imafidon about AI, work transformation, and organizational strategy
Chris Pissarides
Collaborated with the Institute on the Pissarides Review analyzing job role transformation and skills evolution across labor ...
Caroline Criado Perez
Author of 'Invisible Women' cited for exposing data gaps and systemic inequalities affecting women in policy and design
Mar Hicks
Author of 'Programmed Inequality' referenced for historical analysis of women's exclusion from tech industry
Quotes
"It's not just about what happens to an individual, what's happening to a group, what's happening to a company, what's happening to a team, what's happening to an industry and therefore what's happening to society."
Anne-Marie Imafidon • Early in conversation
"The proven use cases where we've said this is something that folks are able to say we're going to set the AI to do it and we're going to look away... it's fraud cases. It's customer service. And then it's not much else."
Anne-Marie Imafidon • Mid-conversation
"There will be a point of no return where we didn't act as we should have in the time... But until then I don't feel like it's too late yet."
Anne-Marie Imafidon • Closing remarks
"If you're not thinking about the long term, you're dealing with the symptom, not the root cause. You're not dealing with the wider set of things."
Anne-Marie Imafidon • Strategic planning discussion
"Critical thinking is probably the biggest one... if this technology is automating, or if it's able to do better data exploration, critically thinking: which parts of the processes that I'm doing at the moment involve repetitive things that can be automated?"
Anne-Marie Imafidon • Skills discussion
Full Transcript
Geoff Nielson: Hey everyone, I'm super excited to be sitting down with Anne-Marie Imafidon, chair of the Institute for the Future of Work. What's so cool about Anne-Marie is that she does it all and sees it all. Not only is she leading the charge on the future of work, but she's a tech CEO, computer scientist, and former child prodigy in her own right. I want to ask her how she sees the nature of work changing, what role AI plays, and what each of us can do so that we're not left behind. Let's find out.

I'm here with Anne-Marie Imafidon. Anne-Marie, thanks so much for joining today. Maybe just to get things started, I wanted to talk a little bit about some of the work you're doing with the Institute for the Future of Work; I know you're a trustee of that organization. To kick things off, from your perspective, what are you seeing in terms of the future of work? What's your outlook? What's changing, and what do we need to know?

Anne-Marie Imafidon: Oh, lots of things that we're seeing, and lots of things that we're still figuring out. As an institute, we're working through the details as different things come onto the horizon and as we start to explore different uses of what we're calling fourth industrial revolution technologies coming on stream: not just AI, but of course big data, quantum, lots of other bits and pieces. In terms of what we're seeing, I think there's a great figuring out of how we make all of these things work for the workforce. But there are two big things that I like to explore with folks based off our work. One is the levels at which these impacts are being felt. A lot of the time, when we talk about the future of work, we end up focusing on the worker and the employee. One of the lenses we have over the work that we do is that this is not just about individuals; this is about companies, and this is about society.
And so that's one of the big things we end up trying to work through with folks: it's not just about what happens to an individual, it's what's happening to a group, what's happening to a company, what's happening to a team, what's happening to an industry, and therefore what's happening to society. Those are the three levels we look at. The other big one, which is maybe a little further off and which I don't know if you've explored with folks on the podcast previously, is this idea of identity and work. We've had this since right at the beginning. I'm the chair now at the Institute for the Future of Work, and I also run an organization called Stemettes, where we work with young people, young non-binary folk, young women, people who have historically been marginalized from the tech industry and the wider STEM and STEAM industries. A couple of years ago, we ended up working in a part of the UK that historically had a lot of miners, and this is mines as in coal mines, which in the UK is an industry that shut decades ago. We were working with teenagers, and the feedback came back that, yeah, these are miners' kids; that's why they're having this kind of response to the event. And this was early on, in the 2010s. I remember saying to a couple of people there: what do you mean, they're miners' kids? The mines shut decades ago. These are teenagers; how is that physically possible? And of course, it was that they were second generation. Their grandparents had been miners as their entire job for life. That was the entire identity they had of these people. So even though those children had never seen a mine, and had never been alive whilst anyone in their lineage was working in a mine, that notion of a job for life was so embedded that not only was it the identity of their grandparents that they were miners, but now of them as grandchildren too.
When I graduated, they'd said that I'd have something like six roles in my career. Now our relationship with work is one where folks might even be doing multiple jobs at the same time, versus that one job for life. So if we start to think of it that way: what's the new identity that people might have if work is not for life, and all these technologies are enabling us to have different relationships with work, with income generation, with what we spend our time doing? That's the other big overarching piece I like to explore, which runs away a little bit from the technology. But so much of this is about the impact of the technology rather than purely about how it gets deployed and used.

Geoff Nielson: It's a super, super interesting piece. I had sort of a sardonic thought, which is that I don't think my grandchildren are going to say they're podcasters' children, and I hope they're not, whether the miners' children example is good or bad. But Anne-Marie, what are some of the implications of that? What are we not seeing here? What does it mean as a society when we move from jobs for life, or "my work is my identity"? It feels like, and tell me if you're seeing the same thing in the UK, it's not even that it's shorter stints at organizations; it feels like it's more part-time, more piecemeal work. What are you seeing, and how is that impacting people?

Anne-Marie Imafidon: So what we're seeing, as you say, is that there's maybe slightly less of a sense of identity in work, or there's a mismatch: folks have been looking for that, have assumed they'd have it, and it's no longer being supported by the way the industry is being transformed.
As an example, if we look at legal practice, the way that folks have moved through the ranks, a lot of those entry-level roles are transforming, are morphing, and so the way that we evaluate folks to become future leaders is changing; the legal industry is having to rethink elements of how that's done as we bring in new forms of technology. The other thing we're seeing, and where we actually started as an institute, is our Good Work Charter; we're a big proponent of it. We're looking at how we ensure that folks are intentional in the way they deploy technology in workplaces, so that people still have good work that promotes dignity, that promotes learning, that allows them to have all the things that might have been part and parcel of jobs and the way work was built in previous generations. If we remove this sense of identity, remove the way things work now, then we've got to be mindful: am I deploying this technology in a way that allows folks to continue to learn? That becomes the way we set work up, so that we still have good work for folks to be doing, rather than being at the mercy of technical and robotic overlords.

Geoff Nielson: Right. Well, it seems like we're backing into something you discussed earlier, which is that there's the worker lens on this, but there's also the company lens and the societal lens. You described it as a sorting-out period, but at least from where I'm sitting, it feels like we're increasingly at odds with some of these mandates as we sort that out, and that what's best for the company doesn't necessarily feel like it's best for the worker or best for society. There's this kind of triplicate tension there.
What does that look like from your lens? And can you tell me a little more about what you and the Institute are exploring?

Anne-Marie Imafidon: Yeah, it's a funny one to say it's at odds. Because we've never had to explicitly say that's how we're developing work, that's what we create work to do, I think it ends up jarring a little to say: what do you mean, good work means this person is learning, or that we're ensuring there's dignity, or that they have access to what they need? Good work used to mean simply that a person got paid well, and that was good enough. So we're definitely seeing elements of that jarring. But what we're seeing more, though, is that a lot of companies, a lot of industries, are trying to make sense of how they make the most of this fourth industrial revolution. What is the transformation? Everyone's saying it's going to be disruptive; we're here talking about digital disruption, right? What is that actually going to look like? And what we're seeing is that this is allowing folks to have a better way to approach the work. So we're seeing a lot of people trying to approach this transformation in a way where they can feel at peace, maybe, with the idea of legacy. I talk to a lot of audiences about this notion of legacy: the decisions you're making today aren't just about the next quarter or the next annual round or the next year; a lot of these decisions play out 50 years down the line. Think of the person sat in your seat then, and the decisions you're making today: the norms you're codifying, the intentions you have for the industry, for the job, for the role, for the nature of the products and services you're developing.
When you think about it from a long-term perspective, it helps folks to ask: what part of this allows people to continue to learn? What part of this allows folks to have dignity? We're finding that organizations are thankful, actually, to have a framework with which to think about this that isn't purely about profits and more money. Even more so, there's so much rhetoric in this hype cycle that ends up being quite dystopian, a bit Black Mirror or WALL-E, or insert other dystopian ending here: things are going to be terrible because we're just going to follow along with the technology and we're all going to end up killing ourselves by accident, or whatever else it might be. Some of these things end up being checks and balances that let folks feel a little better: yes, I've used this audit, I've used this assessment, I've used this tool from the Institute, I've read this paper. I've understood some of the ideas floated here and the warning signs that we have, and so any step I make in this direction is going to be in the right direction and is going to prove a good one in the long term. So that's what we're beginning to see, with people genuinely wanting to work out these details, whether that's unions, regulators, or academics. We work with so many different organizations and types of stakeholders that we're able to have this holistic view of what good looks like in lots of different places and spaces.
Geoff Nielson: If you work in IT, Info-Tech Research Group is a name you need to know. No matter what your needs are, Info-Tech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Info-Tech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe.

I want to clarify a little, because we're talking about this notion of the fourth industrial revolution, we're talking about all this change and the need to do right, and by the way, I love the framework around legacy for a lot of reasons that I want to get into in a minute. But just before we go there: when we talk about this fourth industrial revolution, is this 90% AI? Is it 10% AI? Aside from the AI piece, what are some of the other big drivers, in your mind, of what's shifting for work?

Anne-Marie Imafidon: It shifts and it differs. So AI is not 80%, not 90%; I'd say it's probably more like 60% AI. I think quantum is definitely on the horizon, and we've had big data for quite a while. We also do a lot of work around physical manufacturing and related pieces. So for us, rather than focusing specifically on one technology, across the wider set of research that we're doing we've gone quite broad. And I think we've had to do that because of the longitudinal nature of what we're trying to establish. If we got caught up in AI, we'd end up restricting the impact we're able to have, but also the quality of what we're doing. With AI, actually, there were quite a lot of things to untangle. When we started, there was also the question of what is AI and what's not AI, right? A lot of people were taking statistical analysis and repackaging it as AI, and getting more funding and more eyeballs on it based on that.
So we wanted to rise a little above the hype and not be AI-specific. Quantum is one where, from something I was reading yesterday, I'm still surprised, or maybe I'm not, that it isn't featuring in conversations now as much as AI is. The research is going on, there are supposedly updates and progress being made on that front, and it is going to be incredibly transformative once it goes live, once we've worked out all the elements of what needs to happen for it to be stable and available at scale. So as an institute, we want to be broader and look at all manner of different technologies in the here and now.

Geoff Nielson: Sure. And you mentioned AI hype, which I think for anyone even tangentially tied to this space is just impossible to get away from these days. So I'm curious, from your perspective, Anne-Marie: what are the things you're hearing in this hype sphere that, from what you've seen on the ground, are BS, versus what are the pieces of it, and the implications and implementations of it, that you really think are going to be transformative and disruptive?

Anne-Marie Imafidon: What I think is really funny about this is that so much of the BS is about the potential. Through the Institute and the other work I do, I get to work with all manner of organizations: governments, NGOs, various different industries. And what I find funny slash frustrating is that the proven use cases, where folks are able to say we're going to set the AI to do it and we're going to look away, where we're not still trying to figure it out, not still trying to make it happen, not still trying to realize real-life benefit and realize ROI, come down to fraud cases.
It's customer service. And then it's not much else. A lot of folks are trying to use it for creative work, and a lot of folks are trying to use it for tagging en masse; maybe that's another one I've seen, but tagging as a use case doesn't stand on its own, it's part of a wider process. So fraud is one, customer service is the other, and then folks tend to go quiet. I've seen all levels of uncertainty about what else we're really getting the AI to do. A lot of it, it's not that it's completely BS; it's just that it's still so theoretical, or still so early on, that no one's been able to genuinely realize that benefit, bring it to the ground and say: because we've done it faster, or because we've done it with fewer resources, we no longer pay for this; or, the speed at which we're able to serve our customers is now this, and that's been realized on their end. For me it's quite frustrating, actually, that folks are struggling to identify that, stick to it, and have it repeat, unless it's customer service or fraud detection.

Geoff Nielson: It's really interesting, and I've seen something similar with a lot of the organizations we work with over here. But I'm curious: when you see that it's only one or two things that companies, organizations, NGOs, governments are getting value out of, what's typically your advice for them? Is it: you're not looking hard enough, go back out there? Is it: go all in on AI? Is it: OK, this makes sense for now, let's wait until we find more promising uses of the technology? What do you typically tell them?

Anne-Marie Imafidon: My advice is not as negative as "you're not looking hard enough". My advice is: there are a lot of exciting, sexy things that big tech thinks are the problems.
So go look for the unexciting things. One of my favorite ones recently has been potholes: the AI of potholes. Who's thinking about that, talking about that? There's not a huge amount of money, supposedly, in potholes, but if you're able to sit down and think about it, the folks working in that space really could focus on: what is the data that we have? What's the logic that we have around this? What are the learnings that we have, so we can look at the AI of potholes? It's almost like asking: what's the unsexy stuff? And then, how do we start to experiment with that? You have to do those experiments, and you have to make those mistakes, in order to then have that first-mover advantage. So that's what I end up talking to folks about: there will be things internally, almost hidden problems, that you have to go and work and experiment on, and you want to make quality mistakes, which is the other notion I talk to audiences about. I often give the example that there was a time, before the cloud, when we'd have to save files as we went along. Most folks of a certain age will remember you'd have to Ctrl-S or Cmd-S as you went, and God forbid you forgot to do that or your file got corrupted: you'd have to rewrite your document entirely from scratch. You'd do that and call it "final". Then you'd use the loo, come back, and realize there was a heading at the end that you'd changed. So you'd change that and call it "final final". Then maybe your mum would call you, and while you're on the phone to her something she says triggers your mind, and you'd call it "final final final". And most of us would then be like, brilliant, we're done; you'd send "final final" off and then realize no, no, no, "final final final" is what you should have sent.
And I know we're reaching back into our memory banks. But the next time you sit down to write a document, not only are you Ctrl-S-ing compulsively the whole way through, but now you've called it "final, 5th of January 2026", or, I don't know, maybe that's when you're listening to this podcast. And then you're like, OK, cool, now I know which final it was; I'm going to tack on the date. Or, if you're particularly sophisticated, you start versioning, and you don't even use the word "final". So I take folks through this example: there's a journey you've been on which means that now you don't use the word "final"; you've got a fairly sophisticated versioning system that you follow. But you wouldn't have gotten there if the file hadn't corrupted the first time. And I talk about using AI now in the same kind of way: there are going to be so many different things you want to try and use it on, and then you remember, actually, AI doesn't understand humans. So what's the non-human element of what we're doing that I can train this on, for me to then get the results I want? There are high-quality mistakes that folks need to be making. As you do that, you're going to have a quality of learning that others don't have, and that's going to be your competitive advantage in making the best use of AI. That's where I end up with folks.

Geoff Nielson: I love that approach, and by the way, I'm chuckling as well.
I'm getting PTSD about underscore-final-underscore-final. The one that really gave me philosophical pause was when someone once sent me "final_final_new", and to me that naming convention tells a story of: oh my God, what have we done? We thought we were done and we've gone back to the drawing board. But the reason I laugh is that, as I listen to you talking about finding your own mistakes and looking in the unsexy parts, it feels like the ability to do this successfully is very cultural within an organization. What does the culture look like in an organization that you find is going to be successful with this, versus unsuccessful?

Anne-Marie Imafidon: It's a big point, and it's maybe something that, again, never ends up being an overt part of the messaging: that you need the right culture in order for these things to go well. I think it is those cultures that are open to those mistakes and open to that iterative approach. If you're working in an agile framework, and if you're particularly technical you'll understand what that means, you build in that you're going to have a retrospective. As you're going, that means you're saying: we're going to take the opportunity to reflect, because there will be things to reflect on, things that have gone wrong, things to learn from, things to measure, and then we have a higher-quality iteration. So I think the organizations that do that are the ones that do well. The others that do well are the organizations that culturally value difference as they go through these processes. The other example I have, and any time anyone gives me a mic at a tech conference it's my favorite example to give, is that of periods: periods, menopause, pregnancy.
These are things we don't talk about often enough; they're taboo, everyone's afraid of them and runs away from them, but so many of us have to deal with them on a regular basis anyway. Every couple of years, the health tech industry discovers the period for the very first time. Five years ago, and I won't name who it is because this is being recorded, though if you read my book, She's In CTRL, I have the example in there, a big health tech company discovered the period for the very first time and allowed users to track ten days of their period as a new feature on their fitness tracker. Some folks listening will think, oh my goodness, this is why I'm switching off, the women are talking about periods again. Others are thinking: ten days of a period? What's the significance? And then you have to ask yourself: is it that at these health tech companies, or this one in particular with the ten-day period, they'd never met anyone on their team who'd ever met anyone who'd ever had a period? Or was it that they had a couple of people there who might have known something about it, but what they were contributing, or what they had to say, wasn't valued in those spaces? And if you're literally working on a period tracker at this company and you can't hear that from them, what else have you been working on that you've not been able to hear from those people? So the right kind of culture allows those nuances and those differences to come to the fore. And it's not just about periods, it's not just about gender; all of these things matter. Another one I love to reference is Legally Blonde, the great 2001 film where, spoiler alert, she does really well in a real law case based on her in-depth knowledge of perms. Again, these things feel so tangential, but actually: is your culture allowing different people to show up in the different ways, with the different experiences, that they have? Because, again, a lot of these spaces:
It's a very narrow set of life experiences, and so a very narrow set of solutions that we end up proposing to problems, whereas something a lot broader allows you to make the best use of the AI, because you're not focusing on the same tiny set of use cases that everyone else has already done to death.

Geoff Nielson: I love the examples, and who knows how listeners will react when they hear the period story, but I like it, and I feel like it's a really important point. Culturally, it's not just an impact within the four walls, physical or metaphorical, of your organization; it impacts your customers. And unless your customers all look exactly like your CEO, or whoever your founder is, it's important to have those voices. So how can you build that into the organization itself? Is this something you help organizations get better at? One of the adages about culture is that culture is notoriously difficult to change. Do you work with organizations on cultural change, and what does that look like, typically?

Anne-Marie Imafidon: In some ways I do. With the Institute for the Future of Work, it's definitely something that we're doing, with the research that we do as a lever. As an organization, I run Stemettes, so we work on attracting different types of folks into the industry. And then, ad hoc, there'll be particular partners and particular pieces of work that we put out to help folks think about what that culture change looks like. And you're right: culture is the average of everybody's actions. So it's about how we empower everybody to evolve, and again, like I said, use a legacy lens: 50 years down the line, would I be proud of this decision I've made and the impact it will have had?
And so that's something I end up working on with folks, but it's a really tough one to do, a tough one to transform. Even more so because, as a classically trained computer scientist, I know it's definitely something that wasn't there, and hasn't been a core part of what we think knowledge and value are across the technology scene. So it's also something we're looking at: how do we change the systems and the structures to make sure it's part of the curriculum, part of the way people learn about computer science, so that they understand the implications? It's not just deterministic, not just about the numbers; life is so much more complicated than the maths. So how do we reflect that in the way that we build and the way that we deploy?

Right. You mentioned being classically trained in computer science, and I know some of the work that you do, as you said, with Stemettes and beyond, is with women and traditionally marginalized groups in computer science and in some of these organizations, trying to get more of them into the field. This is probably too big a question to bite off in one go, but what's your sense about why women have been historically underrepresented in tech? Does it matter, and what do we do about it?
Oh, I don't know how much time we have on this. Women have been historically marginalized from the tech scene, and lots of different elements of our heritage in the tech scene mean that we've taken on some unhelpful elitism, maybe from maths: we came as a field out of academic mathematics, and take a look at what's going on over in that scene. I think there's also this myth of the lone genius, people who are born with this knowledge. And if you're born with this knowledge, then you must look a certain way, because of the adjacency we have to the idea of, as I sometimes call it, being dead and white and male with a beard. You must be all four of those things, dead included, to be someone who can be revered. We're all aiming to be that dead genius who created something that's now transformed the whole world, which is a high bar, actually, to say you must be dead in order to contribute technically. We don't grapple with that often enough. There are also lots of geopolitical dimensions; Mar Hicks has written a great book, Programmed Inequality, and there's a lot there to read and delve into on why this is something that's happened again and again. And if I'm going to be completely honest, up until this point it was almost okay, because so much of what we were doing technically was very niche, for a certain set of people in a certain space, and we kind of just made it because it would be cool. Wouldn't it be cool to have a website that rated whether people were hot or not? And we just built that, and I don't think any of us really grappled with the fact that you build that website, and then all of a sudden you're moving elections and you're
transforming democracy as we know it. So I think it matters, because if we're building things for the world, these women are part of that world, so why wouldn't we have their experiences reflected in what we're doing? So we can sell more, if we really want to. But I also feel that we end up creating more problems than we're solving in the way that we build and deploy technology at the moment, because we don't have different types of voices taken into account in what we're creating. Ultimately, I am a technologist because of the altruistic streak I have in me: if I can create this thing, if I understand how it works, the logic, then I can solve that problem again and again without me needing to be there. So it's only fair to the technology that we make the right kinds of decisions about what we're prioritizing, how we're building it, what we're creating, and what life experiences we're reflecting in it. That's why it's important to have women be part of the puzzle, but also other groups: you've got gender, you've got race, all manner of different folks who should be part of it, given this is now global tech that we're building. And there are so many problems to be solved in the world without us creating new ones.

So when we talk about solving this problem, creating a more equitable, democratized environment where everybody can contribute through these technologies, and you don't have to look a certain way or compare yourself to a certain reverential figure, how do we get there, and what sort of work are you doing to get folks there? I love the idea of destroying this narrative of the dead white guy with the beard, and the notion that if you don't look like
Albert Einstein or whoever else, you can't possibly be successful in this. Is it as simple as having more role models that look like you, more diverse role models? And what else are the key factors, in your mind, in making this a viable path for more young people who wouldn't necessarily have seen themselves in these roles?

It's funny that you start with role models in your question. I think it's ensuring that we have the right environment for those role models to exist. So yes, we need the stories to be told of the different types of people who are thriving in the industry, who are creating, who have already made things, whether it's GPS, whether it's Kevlar, insert the name of any of the things that women have created. There's a long list, actually, of women innovators who've done things already, whom we haven't eulogized in Hollywood movies, whose stories we haven't told again and again, who have died, and we've not given them their flowers before they've gone. So there's a whole lot of stories, and I'd appeal to Hollywood, or anyone listening who has the ability to tell those stories, to go and look them up and tell lots of them. There's quite a long list: there's Dame Stephanie Shirley, there's Gladys West. Katherine Johnson we've now actually seen in Hidden Figures, but there are always more, so many others. Let's fill those in. So one thing is to tell the stories. But the other thing is to genuinely work on ensuring that different types of people can thrive in our spaces, so that it doesn't need to be
again that lone role model, that short list of people. It can just be something where, hey, name a female journalist: people can do that the same way they can name a male journalist. We need the same sorts of spaces. I think we also need to be able to hold folks accountable for bad behavior. When we see the use of NDAs across our industry, when we see the norms we have around how people get equity in particular companies, how some people are held accountable corporately for bad behavior versus others, there are a lot of bad habits, and a certain type of person gets protected in these spaces, academically as well as in industry and among entrepreneurs. Who's allowed to fail fast and often, and who's not? So there are quite a lot of tweaks we do need to make to ensure that different types of folks can thrive. Another one: you have this to a lesser degree in the US, but it still shows up. In the UK, you make decisions at 13 about the particular subjects you want to study at a higher level, and those decisions follow you throughout the whole of your life. I don't know if you're even eating the same food that you ate as a 13-year-old, let alone keeping the same hair color or clothing, let alone your entire career. And God forbid you made the wrong decision at 13. Yet we have an entire industry that says no, you cannot possibly be an engineer here, because you did not make that decision at 13; there's nothing of value that you have to contribute here, because you didn't enter onto that pathway. Again it speaks to this idea that you have to have been born that genius, because then at 13 you would have locked in those ideas.
Whereas actually there are a lot of folks who have a lot of things to contribute who were doing other things at 13, and we should value that in what we're creating and doing. So it's about updating the culture so that we can have more of those role models, as well as telling the stories of the role models who have somehow made it through, despite how unfriendly the industry can be.

So, I'll tell you off the bat that 13-year-olds are not typically the target demographic of this podcast, but I am curious, because I think a lot of listeners do potentially have their own 13-year-old or teenager in their house. What do you tell young people these days about what the future looks like, what they should do, how they should be thinking about their own future and their own legacy?

So we at Stemettes have ended up working with this notion of STEAM rather than STEM, so including the arts and design in there. And what you end up telling the 13-year-old is: a lot of these things, we haven't figured them out. As we said at the top of this conversation, we're at the beginning of this fourth industrial revolution, and there are a lot of answers the adults don't have, a lot of things the adults do not know. So we end up talking to them about, okay, what would be your idea, what would you create, what would you prioritize, how would you see this? And we often relate it to what they're already interested in: not AI for the sake of AI, but how does that impact the football team they support, or the clothing brand they love to wear, or the food they like to choose? We start with what they're interested in and relate the technology to that, but in quite a big way.
I think this generation has grown up quite differently from previous generations, in that they've got to see the problems of the world up close, in real time. And so the notion of what success looks like is quite different for them. We end up talking to them about how technology might power that version of success: how might technology help with climate change, how might technology help with fairness and equity and social justice? That's what we end up talking to 13-year-olds about, and we leave the door open to say, hey, study history, study languages, whatever it might be, but understand that technology is a tool that will have a role to play in whatever you end up doing. You can choose to study technology, that's completely fine, or build your tech literacy so that no matter what you do, you're able to apply the technology in a way that makes sense to you and solves the problems that you have. Those are the kinds of conversations, controversially or not, that we're having with them. It's not "everybody learn to code," but it is definitely: learn that critical thinking, learn to collaborate, learn to create, because these are things you'll still need to do no matter how many of our jobs the robots take away.

I love that answer, and it preempted the next thing I wanted to talk about. You said critical thinking, collaborate, create. Coming back to the notion of workers who are facing the reality of coexisting with more and more technology that can do things it could never do before: what are the critical skills for the roles of tomorrow, for being a successful worker or leader? Is it those same three, or how would you frame that out?
Yes, so at the Institute we recently completed something called the Pissarides Review, which we did with the Nobel laureate Chris Pissarides, looking at the transformation of work and jobs at this time. One of the things we did was a huge analysis of job openings across a big jobs platform, looking at how the skills being asked for were transforming and changing over time across this big data set. And what we saw is that, yes, AI and data skills are becoming more prevalent across the different job descriptions going out, but also this idea of communicating, collaborating, critical thinking, creativity. Some folks have called these soft skills; I don't like that, core skills is really what we're talking about. If you want future-proof skills, it's those: synthesizing information, connecting with and understanding others, as well as communicating. Those end up being skills that folks will need no matter what the technology is doing, and they can be augmented by the technology, but they can't be replaced by it. I mean, we've all seen what GenAI is spewing out now; we're calling it slop. That's the overarching term.
And so as a human being, the quality of slop that you would have put out manually is probably slightly different from what the AI-generated slop is. But critical thinking is probably the biggest one. It's possible that some of this is missing, and that's why a lot of those AI use cases just aren't hitting: because folks aren't able to think critically and ask, if this technology is automating, or is able to do better data exploration, or is generative, then which part of the processes I'm doing at the moment, or which part of the value I'm providing to my clients or customers, involves repetitive things that can be automated, or has a lot of data that needs to be explored, or is going to support me in making higher-quality decisions? Critical thinking to say, just because the AI said this, should I believe it? Critical thinking to say, when I gave it this prompt, it gave me this back, so what really is going on here? A lot of that ends up going missing, and it would stand folks in good stead and allow them to explore and make better use of the technology. Because as much as AI is this hype cycle, and as much as we can talk it down, actually the AI is good at particular things. We just have to spend the time to ask: where does that overlap with my specific use cases and work?

When you talk to leaders, then, who are looking at the technology and ideally thinking critically about the organization of the future, the workforce of the future, what's your best advice for them? Are you finding that they're typically thinking too small, or too abstractly? What are some of the biggest mistakes you're seeing?

I think they're thinking too short-term.
I think that's the big mistake I see from businesses at the moment. There's a lot of "this is the problem I have here and now," and you're dealing with the symptom; you're not dealing with the root cause, you're not dealing with the wider set of things. The example I almost always give is when bias comes up, and folks ask me, oh, what can we do to mitigate the bias in the AI that we're building? And I'm like, cool, why don't we rewind? We can build an AI that helps predict and analyze the best sandwiches for us to create and put on the menu at our downtown Toronto lunch place. We can fine-tune that AI to the nth degree and say this is the best AI anyone has ever built for choosing sandwich flavors. But there are a lot of people in Toronto who don't eat sandwiches for lunch. You have to think a little bit bigger: you've prescribed this problem, you've set this scope, and you're not thinking wider. This is why I say 50 years down the line, because when you do that, it allows folks to zoom out and not focus on the current pain point, or even the next pain point, but to go back to first principles and really think about how to do this in a transformative way. I find that a lot of business leaders are held by the neck by shareholders, stakeholders, whatever short loop cycles we're operating in. The folks who do best are the ones who can pull back and say, no, we're going to take a long-term view on our very niche industry. And again, this is not just about tech; it applies in other places. Okay, in education, let's take a long-term view: what is the point of education?
Therefore, are we going to use AI to help teachers in the here and now deal with admin in the current system, or are we going to use AI to completely transform the picture we have of the skills and knowledge that our learners have? If you start to think about that, you work on a whole other set of problems, at a whole other level, and then you're able to fundamentally transform things, rather than just tackling the admin teachers have to do today, which is a tiny use case, and one that has ended up not necessarily giving the right kind of payoff for folks. But when you start to think about assessment, you start to think strategically and long-term, and then work your way back. That gives folks the vision, a bit more motivation, and maybe a bit more room for accountability and learning from mistakes, as we mentioned already. That's what we're seeing work for folks: being able to look beyond the problems of the here and now.

I love that approach, and to me it makes complete sense; I'm 100% bought in. One of the things I've seen is that a lot of leaders, and a lot of people frankly, seem to have a convenient belief right now that we're in such a period of disruption that the long term is completely unknowable. Who knows, we can't possibly know it, therefore we can't plan for it, therefore I don't have to look beyond the next three months, because that's all that matters. When you look out over the horizon, do you give much credence to that, or do you think that's just an excuse for not having to do the thoughtful work required to plan?
I think it's fair; we're only human, and things you can't control, you fear. My biggest response to that, though, is that there's less fear if you are in the driving seat, if you make some decisions and lock some of these things down now. And you do have the agency to do it, because of where we are in this cycle. There will come a time when maybe it is too late, when too many of these decisions have been locked in and too many of these norms have been set, but that time isn't now. So I do get folks excited. There's so much you see in science fiction that's ended up being real. KITT from Knight Rider in the 1980s: I don't remember much of the 1980s, but David Hasselhoff played this guy called Michael who talked to his car, called KITT. In the 1980s it was such a far-off thing to talk to a car, for the car to hear and understand him, and then for them to successfully fight crime together. And I always get folks to reflect that we talk to our cars now. In 2025, soon to be 2026, whenever you're listening to this, there are people for real talking to their cars; it's no science fiction thing. They might not be fighting crime, maybe they're turning up the temperature or changing the track, but the car is understanding them. So I get folks to think: this doesn't have to be a big, major, huge shift where you invent the internet in your organization, but there will be small tweaks based off the desires, dreams, and imaginations that you have, so you'd better imagine. We're human beings; all of us have desires, things we wish we could have. Maybe not all of them are the worthiest of desires, but across an organization you'll have a lot of people.
They'll have a lot of ideas on what could be next, and you could definitely make that happen. If we don't assume that something is within our gift, why do we bother getting up to work and getting up to create anyway? It's a thing to be human: to dream and to hope and to explore. So that's another thing I get folks to do. If you're not a sci-fi person, write your own version of sci-fi. 200 years down the line, what would you be doing in 2225 or 2226, whenever you're listening to this? Just write that down; what would you imagine? Because none of us knows completely what's going on, but all of us can dream and explore and hope. But I understand the fear; it's very human to be afraid of the unknown.

That makes sense, and it's something that I think is being profoundly felt by a lot of people right now. We've talked at length, Anne-Marie, about the worker piece and the organizational piece. We haven't talked a lot yet about the society piece, which you said is the third leg of this stool, and in some ways is maybe the most difficult to nail down but also the most powerful. What are some of the big change risks for society, as well as some of the benefits you see emerging, that maybe we're not taking seriously enough or planning for well enough?

I look at this from the social justice lens. I think the biggest risk we have at the societal level is that we forget, or ignore, or are blind to the social inequalities that we've already had thus far. It's a frustration I have in quite a lot of the discussions I have with people, and with technologists in particular. Like I said earlier, life's more complicated than the maths.
There are a lot of data sets that we just don't have, a lot of things we just don't know about people and have never known, because those in power haven't deemed them important enough. A very UK-specific example: when folks got married before COVID, they entered their names onto the marriage register; we have a central marriage register in the UK. Before COVID, they asked you for your father's occupation only; if you've got married since COVID, you've entered both your mother's and your father's occupation onto the register. So for the whole of history pre-COVID, that's just information we do not have for people who got married. Whether there's something wrong with that or not, I'll let folks unpick. But when we build new models and take in data sets from lots of different places without asking the question, or while being blind to the gaps we've had thus far, we're only going to repeat the issues and problems we've had already, but now at scale, millions of times a second, whereas before it was just that when someone went back through the records, they didn't have the information from, say, medieval England. There are so many things we set aside, but these are real people, real things happening in real spaces that people should really be doing something about. Another example: Invisible Women is another book that I always point folks to, and if you haven't had Caroline Criado Perez on, you should definitely have her, because some of the things she was able to unpick and communicate in that book are fascinating. One example I'd love to share from it is single parents. This is a facet of life: people being widowed, or whatever reason it is that you end up as a single parent.
That's not a new notion, but has it been reflected in law and in our spaces? No. Caroline gives the example of someone who's a director at a company and also a single parent. The people who are directors at this company are all invited to a dinner. They look at where it is, they or their assistant book them into the hotel next door, let's say $200 for the night to attend this dinner, they tick accept, and that's done, booked on the company card. The person who's a single parent doesn't need a hotel room; they need childcare for that night. Let's say it's quite expensive, but whatever, $200 or less on childcare for that night, and that person ticks accept on the invite. Now this will differ; I'm not sure about the US, though I think it might be the same, actually. In the UK, the $200 on the hotel is an allowable business expense. The $200 or less on the childcare is not an allowable business expense. But it's the same amount of money being spent by the same company for people at the same rank to attend the same event. Why does that matter, and what have we not built? This is the thing as a society: there are still all these little things to unpick, ways of doing better for different types of folks, and, like I said, of solving problems at a faster rate than we're creating them with all the other ideas and products people are pushing out to the market. So for society there's an opportunity, and with AI in particular this is something I've been really excited about: so many of these things, people have had a hunch about and thought, yeah, that doesn't seem right, that doesn't seem fair, that's a gap. AI is helping to show a lot of these biases that exist in real life.
Not necessarily for us to mitigate in the AI, but for us to mitigate in life: in the way we run our societies, the way we build our communities and our groups. So there's an opportunity for all of us to do better for everybody, and for all of us to gain from it.

I like the examples you use precisely because the scenarios are so obviously dumb. How can we not just be better at this? I think all of us, in some capacity, have encountered this and said: why is this fair, why does this make sense, why am I being penalized for this, or why is that person being penalized? There are certainly a lot of intractable problems, but to your point, there are a lot of problems that are not at all intractable; we just have to say this does not make sense, let's figure it out.

Yeah, let's figure it out.

Is that a mindset thing? Because there's so much talk of, oh no, that's not my job, that should be government, from business leaders, and government saying that's the business leader's job, not mine. Is this just a matter of all of us pulling up our bootstraps and saying let's try to make the world a better place, or what's your best guidance for actually making some of this happen?

Ideally that would be it: we'd all say let's do better for everybody, and we'd all work together. A lot of it, as I see it, just ends up being norms. We have so many unintended consequences; Uber merely existing has meant that the laws around gig work have transformed in so many different countries.
And so I feel like we haven't scratched that, or used that lever, as much as we could have. Hey, these are norms we could build into the technology that might just solve this problem, that mean a woman bearing the title "doctor" doesn't get locked out of her gym locker every time she goes to the gym. There are so many small bits like this. Government and business can eventually solve the big things, but for the tiny, irritating pieces, we can just say: that's something we're not going to do on this platform, or something we're not going to have going forward, or something in our company we're going to set as a norm, which then means that in our supply chain and our customer chain, across our ecosystem and the sphere of influence that we have, that becomes part of a norm that we've upgraded and changed. So I think it's a combination of both, but if everybody took it upon themselves to understand their sphere of influence and then to implement things properly across it, a lot of good things would happen. The other thing, if we go back to our earlier point, is that if different folks were valued and had access to make these technical decisions upstream, we'd have better solutions, which is the other thing we end up missing out on. We probably see it most in health, in the different things that come out of pharmaceuticals because of who had the most money, who they thought was the biggest market share, rather than the actual solving of a problem we've had for ages. Endometriosis is not a brand new condition.
Someone should just take that as something they're going to work on, and just solve it in the small, with the use of this AI. It's why I get excited about things like AI for potholes. There are so many problems where, the more accessible and democratized this is, and the more folks understand they can just experiment, the more new types of ideas and solutions we might get to see. A couple of years ago I had someone come to me who had conceptualized a new type of social media platform that centered on the idea, not the person. And I remember thinking, yeah, if you'd been in those rooms at the beginning, when we were doing bulletin boards and all these other forums, you wouldn't have centered it on the user; there would have been another way we cut that information, centered on the thought, on the nugget of the idea. How many problems could have been solved, all these things that could have happened, if we'd just had something slightly different? And there's nothing to say that, if that person had been there at the beginning, cyberbullying in the way that we know it couldn't have been avoided; maybe we would have just missed it entirely, for having something slightly different there at the start. So it excites me, the capacity we have to solve problems by taking a slightly different approach to the way we engage with technology.
Well, and it excites me that it excites you, and that's really great to see, because not everybody I talk to is this excited by it. It's so easy to look out over the horizon and see so much change and so much uncertainty, and there's obviously no shortage of things that aren't going well today, and there are things that will be going better and things that will be going worse. But it seems like, net net, you're optimistic about now. Yes. Well, so tell me more. I'm optimistic for now. There will be a point, I'm sure, of no return, where we didn't act as we should have in the time, and we're much further along when we look back in reflection. The norms have been set and the dies have been cast and we've locked ourselves out, we've shot ourselves in the foot, at which point I will run to a bunker. Or run somewhere, the East Coast of Kenya, just go hide away and do maths on the beach forever, retire to just solve math problems on the beach. But until then, I don't feel like it's too late yet. So now's the time to set the norms, to influence some of these cultural changes and decisions, and to try and build the legacy that we want. Exactly. Exactly. Now's the time to try to genuinely do better and be inclusive of folks to help us do better. I don't think it's too late for us to edit things, change things. And I see this so much in the work that we do at the Institute: firms are still trying to figure out these use cases, they're still trying to work these things out. So it's not set in stone yet. There's definitely a lot more disruption on the way, and a lot of these white-collar industries are having to change how folks enter them. So there's a lot of disruption that's happening.
I think when the dust settles on that, then we might be slightly closer to me saying maybe it's too late. And I'll ask you the question that everybody hates, which is: what's your predicted timeline for when the dust settles on that? Is it 2027? Is it 2030? No, I don't think it's a time thing. As we've seen with this hype cycle, and as we know classically about technology anyway, right, it's not just the presence of the technology, it's the adoption of the technology. And you can never really tell. I don't know how many of us could have predicted that we'd still be this deep into the AI hype cycle here and now; three years ago it would have been blockchain and Bitcoin. I'm guessing maybe in five years the quantum volume will have come up ever so slightly. But I don't think it's a particular point in time at which the technology becomes relevant. I think, again, as a classically trained computer scientist, it's less about the tech and more about the adoption and the socio-economic scenario in which it's being deployed and used. I don't think it's a time thing. I think there'll be some particular markers and particular norms, and it's about which tech superpower is using technology, for which folks, and to what end. That sounds so mysterious, doesn't it? Wow. It's an internal feeling. You should just start a website: am I on the beach or am I not on the beach, am I in the bunker or am I not. That'll be how we know. The day that I book the one-way ticket and sit on the beach, then everyone can hunker down. Yeah. If you're on the beach, we're in big trouble. Exactly. Yeah. This is it. So just come sit on the beach with me instead. There you go. I can think of worse things.
Hey, Ann-Marie, I want to say a big thank you for coming on the show today. It's been really insightful, really interesting, lots of great stories, and it gives me lots to think about. Thanks for having me, Jeff. Good luck, everybody. If you work in IT, Info-Tech Research Group is a name you need to know. No matter what your needs are, Info-Tech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Info-Tech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe.