#328 Transforming Learning: The Role of AI
34 min
Feb 24, 2026

Summary
Dr. Darren and returning guest Dr. Karm Taglienti discuss how AI is transforming education and organizational learning. They explore moving beyond traditional testing methods toward practical, application-based learning that develops critical thinking skills, while leveraging AI as a tool to accelerate understanding rather than replace rigorous academic thinking.
Insights
- AI can serve as both the disruptor and the solution—organizations can use AI techniques to address problems created by AI disruption itself, creating a cyclical but productive problem domain
- Traditional assessment methods (quizzes, essays, standardized tests) fail to measure true understanding; real learning requires demonstrating ability to apply concepts in real-world scenarios
- The widening knowledge gap poses a societal risk: as AI amplifies capabilities, confident misinformation spreads faster, making critical thinking education essential to combat 'AI fakers'
- Engineering-focused practical learning (why and how to apply) should complement but not replace theoretical rigor; most professionals need 'enough' theory to use tools confidently, not deep expertise
- AI can democratize learning by serving as a co-pilot for critical thinking—students can ask 'why' repeatedly to deepen understanding without waiting for instructor availability
Trends
- Shift from prescriptive, step-by-step curricula to principle-based learning that emphasizes critical thinking and application over memorization
- Growing recognition that AI literacy requires understanding 'why' systems work, not just 'how' to use them, to prevent widespread misinformation
- Emergence of competency-based assessment models (workshops, real-world problem-solving, whiteboard exercises) replacing traditional exams in tech education
- AI as learning accelerator: using LLMs to rapidly prioritize use cases, prototype solutions, and explore concepts without expensive trial-and-error cycles
- Widening stratification in AI knowledge: those who understand AI rigorously vs. those who appear to but don't, creating societal risk from confident misinformation
- Integration of AI tools into curriculum design itself—educators using AI to analyze research, understand learning outcomes, and design better assessments
- Workforce education shifting from theoretical knowledge to practical skill-building with emphasis on collaboration, critical thinking, and adaptability
- Reframing of educator role: from content delivery to learning facilitator who teaches students how to leverage AI as a thinking partner
Topics
- AI in Education Transformation
- Critical Thinking Skills Development
- Assessment Methods Beyond Traditional Testing
- Practical Application-Based Learning
- AI Literacy and Misinformation Prevention
- Curriculum Design for AI Era
- Theory vs. Engineering in Education
- Organizational Innovation and Change Management
- Workforce Development and Reskilling
- AI as Learning Accelerator Tool
- Competency-Based Assessment Models
- Determinants of Successful Innovation Framework
- Deep Fakes and AI-Generated Content Risks
- Stratification of Knowledge in Society
- Educator Role Evolution
Companies
Wake Forest University
Dr. Taglienti serves as academic director focusing on AI strategy and innovation curriculum
Northeastern University
Dr. Darren teaches courses on AI topics and cybersecurity at the graduate level
North DC University
Dr. Taglienti initially began teaching at the graduate level and continues teaching there
National University of Singapore
Dr. Taglienti teaches AI and related topics at this institution
Asian Institute of Technology
Dr. Taglienti teaches AI courses at this institution
People
Dr. Karm Taglienti
Returning guest; educator with doctorate in education; academic director at Wake Forest; expert on AI strategy, innov...
Dr. Darren
Host of Embracing Digital Transformation; Chief Enterprise Architect; educator and author focused on digital transfor...
Richard Sutton
Referenced as leading expert in reinforcement learning algorithms that Dr. Taglienti admires but unlikely to collabor...
Quotes
"Instead of trying to keep it at bay, we embrace it, not meaning adopt it fully, but we embrace the understanding of it so that I can now manipulate it and use it for what I want to use it for."
Dr. Darren•Early in episode
"It's a bit of a cyclical problem domain. AI can help us realize the disruptive effect, but also help to address it."
Dr. Karm Taglienti•Mid-episode
"True learning means being able to not just explain the theory, but also to understand how to apply it."
Dr. Karm Taglienti•Later in episode
"I think there's a lot of fakers out there now. We have to do a better job of people truly understanding or misperceptions."
Dr. Karm Taglienti•Mid-episode
"AI is a magnifier. If someone was out there already pretending, AI is going to help them pretend better."
Dr. Darren•Later in episode
Full Transcript
We're also going to look at it from the perspective of how can I use these techniques to help to address the problem that you've created? So it's a bit of a cyclical problem domain. Yeah, this reminds me of the name of my show, Embracing Digital Transformation, right? Because instead of trying to keep it at bay, we embrace it, not meaning adopt it fully, but we embrace the understanding of it so that I can now manipulate it and use it for what I want to use it for. Welcome to Embracing Digital Transformation, where we explore how people, process, policy, and technology drive effective change. This is Dr. Darren, Chief Enterprise Architect, educator, author, and most importantly, your host. In this episode, Transforming Learning, the Role of AI in Education, with returning guest, Dr. Karm Taglienti. All right, Karm, welcome to the show. I am super happy to be here. Thanks, Darren. Looking forward to the conversation. Yeah, you know, we talk every Friday. Well, most every Friday, except when I'm traveling, which lately has been a lot, and you've been traveling. Indeed. But you're a returning guest on the show. This is, what, your second or third time? Second, I believe. But, yeah, it kind of feels like more because we talk all the time. But I'd love to share our thoughts with your audience, for sure. Yeah. So, Karm, I know you've talked about your background before, but let's talk about your background with respect to education, because that fits in really well with what we want to talk about today. Sure. Yeah, no, absolutely. And it's kind of interesting because I grew up more of a classical technologist, undergraduate in computer science. Then I went into systems engineering, very focused on creating computer systems and programming and those kinds of things. But then a little bit later in my life, I decided I wanted to be an educator.
So I started teaching at the graduate level at North DC University, where I initially did some teaching and I still teach there today, along with a couple of other universities. But I decided to go and get a doctorate in education, believe it or not. So I got my doctorate in education and it was very rewarding for me. And I focused on things like how people learn within the organizational setting as it relates to disruptive features of things like technology. So it's a really fascinating area. And I've found now that that was like the perfect dissertation or research area. And in addition to that, I'm an academic director at Wake Forest University, where I focus on a curriculum around AI strategy and innovation. And you are graciously one of my board members, and I'm super excited to have you on my board there as well. And certainly, I told you, I teach at Northeastern around AI topics and cybersecurity, as well as National Singapore University and the Asian Institute of Technology. So I don't do a lot in the education space. I'm just kidding. But it's definitely a passion of mine and certainly an area that I just love to teach, and I love to learn about how people learn. And that's hopefully, well, I'm sure we'll talk about it, but I know it's one of your passions too. It's fascinating because you got your dissertation right before all this kind of unpacked, right? This whole generative AI, which is completely changing education. Anyone that doesn't think so is living under a rock. That's right. That's right. So how did that prepare you for what you're seeing happen? And did you ever think anything like what has happened would happen when you were doing your research? Because there have been other times where we've had technology introductions that have been disruptive. But have you seen anything this disruptive, or is this just par for the course?
Well, you know, it's a great question because what really prompted me to pick my research topic area at the time was, you know, I'd been in the industry for 20 years-ish before I decided I would do this research. And what really bothered me was the fact that many organizations would look at technology and try to change their business. I'll call it a disruption or innovation. And they would fail. And I really wanted to know, like, why was this a failure? It wasn't the technology's fault. And it wasn't the individual's fault. But it was sort of a combination. It was like the, it was the organizational element in addition to maybe some missed expectations with respect to the technology, et cetera, et cetera. And so I came up with this framework called the determinants of, determinants of successful innovation. Like what are the factors that impact that? Now, it's not a recipe that says that if you have three of these and two of those, then you're going to succeed. It's more of the, what are those influencers? And I think at that time, I was really just trying to look at it more from an, I'll call it an engineering perspective. Like there must be some kind of strategy to associate that with successful innovation or successful transformation. And that's what the research was all about. Now, you know, fast forward now into the world of AI. Well, all of a sudden now this big disruption happens. The same elements exist. And I think to answer your question, this probably is the biggest kind of disruption I've seen, at least in my career. Now, the cloud was pretty big and the introduction of the phone's pretty big. But this one was just at a much larger scale. But I also think, and maybe we could talk a little bit about this, is I think that some of the ways in which we can address the disruption as part of successful innovation is by leveraging the capability, which I think is another interesting thing. 
So instead of just saying, hey, you caused this big problem for me, we're also going to look at it from the perspective of how can I use these techniques to help to address the problem that you've created? So it's a bit of a, you know, cyclical problem domain. Yeah, this reminds me of the name of my show, Embracing Digital Transformation, right? Because instead of trying to keep it at bay, we embrace it, not meaning adopt it fully, but we embrace the understanding of it so that I can now manipulate it and use it for what I want to use it for. You know, and we're in the middle of this right at this point, so I wouldn't say I've got an answer, but I certainly use AI techniques with some of my dissertation materials or my research results, and I use that to try to help me understand how organizations can succeed at AI transformation as part of going through this. And it really allows us to move a little bit faster, I think, in some ways, because of the fact that there are so many complex moving parts. And last time I checked, I can't keep like 37 things in my brain at one time. So it really helps to keep us honest, and it helps us to do things like, I'll just give you one good example of something that we use. It's a tool that we use to be able to understand which use cases within an organization are the best ones that could provide for them the highest return on investment, or at least meet their expectations. Kind of what I was talking about. And you can approximate that pretty easily, because if you set a context and you identify maybe what some of the measurable areas of success or criteria might be, an AI system using sort of standard LLM technology can help you to create prioritized lists, or give you, not the answer, but at least a relative assessment of, these are good things that you can do, or these are realistic things that can provide value to you as an organization.
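The use-case prioritization Dr. Taglienti describes can be sketched as a simple weighted-scoring pass. The criteria names, weights, and candidate use cases below are illustrative assumptions, not taken from the episode; in practice, an LLM would supply the per-criterion ratings from written descriptions of each use case and the organization's context.

```python
# Hypothetical sketch of LLM-assisted use-case prioritization: score each
# candidate against weighted success criteria, then rank. All criteria and
# weights here are made-up examples for illustration.

CRITERIA_WEIGHTS = {
    "expected_roi": 0.4,        # relative return on investment, rated 0-5
    "data_readiness": 0.3,      # is the needed data available and clean? 0-5
    "implementation_cost": 0.3, # inverted scale: 5 = cheap, 0 = expensive
}

def score_use_case(ratings: dict) -> float:
    """Weighted sum of criterion ratings. An LLM could produce the ratings
    given a context prompt and the measurable criteria the organization set."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

def prioritize(use_cases: dict) -> list:
    """Return use-case names ordered from highest to lowest score."""
    return sorted(use_cases, key=lambda name: score_use_case(use_cases[name]),
                  reverse=True)

candidates = {
    "invoice triage":   {"expected_roi": 4, "data_readiness": 5, "implementation_cost": 4},
    "churn prediction": {"expected_roi": 5, "data_readiness": 2, "implementation_cost": 2},
    "chatbot support":  {"expected_roi": 3, "data_readiness": 3, "implementation_cost": 3},
}

print(prioritize(candidates))  # → ['invoice triage', 'churn prediction', 'chatbot support']
```

As in the conversation, the output is a relative assessment rather than the answer: the ranking is only as good as the context and criteria you set up front.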
And so we can move so much faster than we used to be able to, because historically it was sort of the, let's do three prototypes, see if it works. And it's like, oh no, it didn't work. We just blew, you know, whatever, $200,000. Let's go back to the drawing board. And so, I think in that way, AI can help us to realize sort of the disruptive effect, but also help to address it. So it's pretty cool, I think, in the grand scheme. Yeah, I agree with you here. I think it can act as an augmenter or an accelerator to solve problems that it causes. Right. It's its own medicine. I don't know if that's the right metaphor or not. Now, what is this doing to education specifically? Because I know my approach to this and what I've been thinking about doing. But let's hear it from someone other than Darren. Yeah. No, you're going to get the same answer. No, I'm kidding. We have talked about it, but yeah, I agree with your approach and I love what you're doing. And I think there are certain elements I would totally agree with, and I take a few maybe different ones, so I can get into that in a second. But before we even get there, the one thing that I will mention to tie all of this together is that in order for us to be able to truly transform, we have to make sure that we understand and educate our communities. So I was talking more in the business context, but there is an element of this that is, I'll call it education as it relates to the workforce. Not as much classical education as we think about in the academic context, like university-level education, et cetera. So we may draw the distinction between the two, but I think they're interrelated. So I don't think they're completely separate as we think about, you know, education. But to that point, let me get into sort of classical education.
And you can almost see the inferences into the way that we learn within the corporate world. And we could talk a little bit about that, too. But in a classical education setting that you and I are exposed to all the time, it's, you know, we have to remove ourselves from the standard testing regimens, or pedagogy, if you want to call it that, as it relates to how people have learned historically. And it's so easy for many of us to just fall into the same ruts again. I'm designing three different courses and curricula right now, and it's always the same. It's like, where's your quiz? Where's your essay? You're going to do a midterm and a final. And I was like, whoa, whoa, whoa. Within this new world, those are not really good ways to truly understand learning. So this is where we can now take advantage of the disruptive effect of AI and think a little bit more about, we have the liberty to be able to move faster, or take the next step in terms of critical thinking, or truly defining what we mean by understanding. And so the definition of understanding then becomes one of, not only do you know a technique or have been taught a theory or a concept, but can you apply it? And that's really where we're going with this. So some of the things I know that you're doing and some of the things that I do are more related to, how can I apply what I've learned in a real-world setting? And that's how I think we need to transform the educational process. So, like, I'm trying to do this more effectively, and I'm trying to do it in the current curriculum. And I think I've come close to figuring out how to do this, but I'm teaching students about a particular type of, say, AI infrastructure architectural design technique. And then I'm going to do what I like to call a workshop style, call it a speed Kaggle competition or a coding competition, where it's like, here's your problem. You have a half hour to solve it. So do you know how to approach the problem?
And can you effectively describe to me how you would approach it? It's almost like, and I'm trying to do this more from the, walk me through, like a whiteboard, like an individual exercise. And sometimes, you know, we would call these panels, where we would interview an individual. But instead of actually writing a paper for me, I want you to tell me your thought process and tell me how you apply the concepts in a real-world situation, because that to me indicates true learning, true understanding. So that's one way to do it. Another way is to, and we can probably get away with this for a little while, maybe make recordings: create a slide deck, explain to me the concepts. I will listen to you describe what you're trying to do and what it means to you. And if you can't explain it very well, that's another indication of the fact that you probably don't really know it. And will people read stuff? Maybe. Will people, you know, they're probably not going to use avatars yet, but you get the idea. So I think. Yeah, they could use deep fakes or things like that to do the recordings. Right, right. So, you know, we're trying to figure out how to do that. You would use this technique to do assessments to see if they have learned the material that you're trying to get across, because ultimately we're trying to instill some skills in these students, right? We're trying to get them to, what was the term I heard the other day, hardened skills, core skills, like critical thinking, collaboration. I was at a CSU board meeting this last week, the workforce acceleration board, which I'm part of, and I don't consider myself an educator, because I've only been teaching a year and I didn't take any courses on how to teach. None. Zero. Right? And so I'm a technologist. So these guys are throwing around words I've never heard of before. And I'm thinking, okay, I don't get it.
I don't know what you're saying. But you would know what they're saying, because you're educated in that. But the approach that I'm taking, like you mentioned, is very different. My end goal is, do my students understand what I'm trying to get across to them? And I've put in there what my objectives are for the semester. I want my students to understand these key principles and be able to apply them in the real world. And that's it. So if they put in the effort, then I should put in the effort too, I guess. I don't know what the right answer is. Yeah. No, I believe that that is the right answer. And I think it stems from, well, there are two different ways to look at it, I think. There's the theory-based, and then there's the, we'll call it theory and maybe engineering. So you need to look at it from the perspective, like, if I were a theoretical mathematician, then I'm trying to actually explore different ways to think about mathematical principles and representations of the real world as it relates to math and mathematical proofs and theorems and axioms, etc. So I can get into theory, and then maybe imaginary numbers, for example. You know, I can represent a phenomenon with an imaginary number. I don't have to necessarily prove it, because that's not what I'm interested in. But an engineer is something slightly different. An engineer is looking at it from the perspective of, I have to make practical, principle-based decisions in the real world. Or even physicists, but let's go engineering, because maybe that's more near and dear to our heart. But in those two realms, I think learning is slightly different in each one. So if, for example, I'm in the mathematical world or the academic world or maybe the theoretical world, I use proofs to be able to demonstrate understanding.
And if I can prove something, then therefore I have, you know, I've learned that particular topic area, or at least been able to talk about it in a way that other people can understand. If I'm an engineer, then I can do the same thing, but it's a principled approach. So did I apply the principles in a way that's sound and applicable in the real world? That's another way to demonstrate understanding. So I think, you know, I personally err more on that side of practical experience, which is also more engineering focused. And I think what you're doing is exactly that. And this has been an argument I've had for a long time: when somebody wants to learn about AI, it's like, I don't need to teach you about linear algebra and calculus and statistical derivations and probability at a real theoretical level. I'm just going to show you techniques that you can use in order to be able to solve a particular problem: which algorithms work, why they work, how you apply them, what the expected result would be. I don't have to know all the theory to do that. It's really nice to know if you're, you know, geeky like us, but it's not something that's required. So I think that sort of draws that distinction. I call it approximation; maybe approximation is not a good word, but it's sort of the, I just need to know enough to be able to use these skills. And I have to have enough confidence to know that when I do apply them, I'm going to get a result that makes sense. And for our students, just like you said, because you hit it 100% on the head: I want to make sure that you have these practical skills that you can apply. I'm setting the expectation for you that will make you successful. But you, the student, need to step up to the plate and demonstrate to me that you have these skills before I will give you a passing grade in my course.
And so I think that that's exactly what we need to do for most of the population. Because even I, you know, I like to consider myself a bit of an academic, maybe not as deep as most academics, but in the grand scheme of things, I won't be writing my own, you know, neural network algorithms, or I won't be working with Richard Sutton on the next version of, you know, the reinforcement learning algorithms. I'd love to do that, but I'll never do that. But I understand them, and that's about as good as I'm going to get. But I can use that to help explain to other people why these are the right techniques. So it's a long answer, but I think you're spot on. Now, do you think if we only take this engineering approach, which we won't, but because of generative AI, do you think some of the more theoretical academia is going to start falling aside, because of the fear that we may have in education to move away from theoretical to more practical? Because that's how I can actually create some value in society. Because we're going to relinquish some of the deeper thinking or theoretical stuff to an AI. Because an AI can't do real things, not yet. Right. Where a human can produce real things. Do you think there's a potential there to lose that kind of academic rigor? I don't think so, and here's why. Because I think if you look at, you know, let's just take maybe a statistical measure or percentages: how much of the current academic community before AI appeared were true theoreticians? Not many. Right. True academics. Or even, I mean, even look at it from the perspective of how many people have PhDs. Right. I mean, it's like some crazy small amount. So I think having the expectation that a large part of the population will, say in the case of AI, really understand AI and how it works, and, oh, how does a convolutional neural network work? And it's like, you'll never need to know that.
So it's maybe crazy for us to believe that that's true. But many, many, many people, you know, and maybe there's a missing middle in there somewhere. There are the engineers that understand true application and why things work, not so much how they work, but why they work. And then I think there's also that missing middle, which is sort of what we're trying to address. It's like, you don't have to be the expert engineer either, or the, you know, call it the executor, the person that can execute on these concepts. But you should at least have a decent understanding of why things work, and with enough confidence to know that these are founded in rigorous principles and solid, I'll call it intellectual or academic rigor. Who's doing the rigor is probably a smaller part of the population. It's been that way for a long time. Like, you know, why did I always hate math when I was growing up? I love math now. But why did I hate it? Because it's so hard. And not so many people did it. But, you know, you get the idea. No, I think we'll find out at the end of the day there's probably sort of the standard stratification, if you will, in terms of the population, in terms of, like, who knows things at the theoretical level, who knows things at the deep engineering level, and who knows things someplace in between. But I think the real problem we're having today, and we could talk about this too, is I think there's a lot of fakers out there now. I really think we have to do a better job of people truly understanding or misperceptions. Maybe I won't call it fakers, but I mean. No, they're AI fakers. No, I know. Sometimes I go to listen to a talk and people are trying to tell me how AI works, or not tell me personally, but the group. And I was like, that's not right. That's not right either. No, no, no, that's not right either.
And I was like, do I stop them and correct them, or do I just sort of let it go? And I guess, because I don't want to disrupt the whole thing, I let it go. But it's like, oh, my God, we really do need to learn more about how things actually do work. And this is an area that could possibly be, you know, I wouldn't say harmful, but certainly, well, me, I will say harmful, harmful to our society. Because I think in a lot of ways, misperception and misunderstanding could equate to, you know, problems for us. And, you know, like deep fakes or things that could be malicious. It's interesting that you brought that up, because there's a lot of people out there talking about AI that don't know what they're talking about. That's right. That's right. Yeah. Right. Well, because it's a big buzzword. Do you think that AI is, I think AI is a magnifier, right? So if someone was out there already pretending, AI is going to help them pretend better. If someone is out there that's highly theoretical, they're going to be able to become even more theoretical than they were before. Do you see the stratification widening because of that? Meaning the gaps between them, that it actually may cause some harm to our society as a whole, a segregation of thought, that AI will magnify those segregations of thought? Yeah. Am I making any sense? You are. This is just, like, spewing out of my head. No. Yeah, no, I completely agree with you, because I do think there'll be a widening gap at some point, where it's going to be those that know and those that don't.
But I think the harmful element comes in with those that don't know saying that they do. And I think you can say it with confidence because the rest of the population doesn't know. Like, if I were to tell you something about, whatever, I don't know, the surface of Venus, and you didn't know anything about it, and I did it confidently, you'd be like, sure, Karm, whatever you say. You know, so, and, you know, I think that we have that element, unfortunately. So people would be like, that sounds plausible, in the AI space, and I'll believe you. But I think we just need to do a better job in terms of making sure that we understand, you know, what's real, what's not, or how to ensure that what's real is. And this gets all the way back to your critical thinking that you were talking about earlier. Like, I think as a society, if we become better, and this also maybe becomes our charter as educators, if we are saying, hey, question the obvious or question the unknown. And if you have a good critical thinking strategy, then you'll ask the right questions, or you'll try to put it in terms of contextualization of things you already know, to say, that doesn't make sense to me, or that does. And for students that have come through our classes, they've already been through that. It's like, I made you, quote unquote, go through the process of critical thinking to demonstrate to me that you understood the concepts I gave you. So therefore, please do apply that in your professional life and in your personal life. And then I think we just increase our, you know, an educated society, and not avoid that problem, because it will exist, but maybe we can quell it a little bit and sort of take the next step as a society. Now I'm getting all, you know, universalist here and talking about the crazy future.
But I do think it's a good opportunity for the human race generally, you know, and again, I'm not saying it. Oh, I think so too. I think so too. We as educators, though, have to change the narrative. Yeah, that's right. That's right. When I first started teaching, I took over someone's class who was a brilliant educator. I mean, he's written books. He's, I mean, incredible. And his course was not set up for AI usage. So his course was, if you do these things, you're going to learn these principles. If you do step by step by step, right? So it was very prescriptive. And what I found towards the end of that semester that I taught that class was the kids could cheat. They could game it, because it was so prescriptive that an AI could do it. And so I had to go back to the drawing board and say, wait, my goal is to teach these specific principles, and critical thinking is one of those skills that they need. And there was no room in the curriculum for critical thinking. Does that make sense? Instead, it was all about teaching principles, but no critical thought was going into, well, why did I use dependency inversion? Instead, it was, this is what dependency inversion looks like, and this is what it is, and this is how to use it, instead of, why would I use it? And what are the trade-offs of using it? So I had to completely rewrite the courses that I'm teaching, because I didn't feel good about the end goal. I didn't feel like it was giving them the best opportunity to understand the principles and to build up that strong critical thinking muscle that I want them to have. Right. Yeah, and I love that. And I also think that, and this is something where, like when I was younger, you know, I didn't come from a very educated upbringing. So I grew up outside of Buffalo, a blue-collar family. You know, an education was just sort of the, oh, you're the one that gets to go to college.
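The dependency inversion principle Dr. Darren mentions can be illustrated with a minimal sketch; the classes and names below are hypothetical examples, not material from his actual course.

```python
# A minimal sketch of dependency inversion: the high-level module depends on
# an abstraction rather than on a concrete implementation. All names here are
# illustrative, not from the course being discussed.

from abc import ABC, abstractmethod

class Notifier(ABC):
    """Abstraction the high-level policy depends on."""
    @abstractmethod
    def send(self, message: str) -> str: ...

class EmailNotifier(Notifier):
    def send(self, message: str) -> str:
        return f"email: {message}"

class SMSNotifier(Notifier):
    def send(self, message: str) -> str:
        return f"sms: {message}"

class OrderService:
    """High-level module: it only knows about Notifier, so transports can be
    swapped (or faked in tests) without changing this class at all."""
    def __init__(self, notifier: Notifier):
        self.notifier = notifier

    def place_order(self, item: str) -> str:
        return self.notifier.send(f"order placed: {item}")

print(OrderService(EmailNotifier()).place_order("book"))  # → email: order placed: book
print(OrderService(SMSNotifier()).place_order("book"))    # → sms: order placed: book
```

The "why and trade-offs" question the rewritten course aims at would be answered here as: you pay for an extra abstraction layer and some indirection, and in exchange you get testability and swappable implementations.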
But so I didn't really know how to learn. And I remember getting into college, and I would literally have a dictionary right beside me, because when I didn't know something, I had to look it up. And that was how I was going to make up for the fact that I didn't have an amazingly good, you know, I'm not criticizing my education. I'm just saying it was not as good as maybe others have had, but whatever. And, you know, woe is me. I'm just teasing. Right. But I think now, to your point, if I can teach you what critical thinking is, then in order to be a good critical thinker, how can you answer the why? I have a two-and-a-half-year-old grandson now who asks why all the time. But now AI can help with the why, right? So if I think about it, it's like, why, why, why, why, why? If you are a good critical thinker and you're trying to understand a concept and you're trying to really understand the why, and, you know, your professor's not around, it's not in your book, or you'd have to read a whole other book to get there. Well, I have a great vehicle now to be able to learn these concepts at whatever level I need to, in order to ensure that I can get to the point where I can understand the concepts so I can continue my learning process. And that is amazing. That is game changing across the board. So that, I think, is another thing we have to teach students: how do I leverage the assets that are available to me today to become a better learner, so that I can continue the process? And to your point, we can't just say, if you pass these three tests, you get a degree. We can't do that anymore. It's like, you have all the tools, humanity, go figure out how to truly learn something. And use the definition I gave for learning: it means being able to not just explain the theory, but also to understand how to apply it. That's real understanding.
That is what we need to help with. And if you can't do it on your own, you'll let AI help you as a co-pilot or as a, you know, whatever, an advisor. That, I think, is really where the onus is on us to get there. Well, Karm, I always have fun talking to you, but we're out of time for the show. That's great. But we should do one of those long forms. We could fill up an hour or two. Yeah, we could fill up a couple hours. But I think our audience may fall asleep if we do that. I think you're probably right. Right. But Karm, you'll be coming back on the show. We're going to probably have you on monthly to see how things are progressing. Yeah, great topic. We're in the middle of this. So this is an exciting time. So thanks for coming on the show. Well, thank you so much. And thank you to your audience for listening to us. Hopefully we didn't put anybody to sleep, but very, very fun, very exciting. And thank you so much for what you do, Darren. Hey, thanks, Karm. Thanks for listening to Embracing Digital Transformation. If you enjoyed today's conversation, give us five stars on your favorite podcasting app or on YouTube. It really helps others discover the show. If you want to go deeper, join our exclusive community at patreon.com slash embracing digital, where we share bonus content and you can always connect with other change makers like yourself. You can always find more resources at embracingdigital.org. Until next time, keep embracing the digital transformation.