This is Scott Becker with the Becker Business and the Becker Private Equity podcast. We're thrilled today to be joined by a brilliant leader in artificial intelligence. In fact, we've selected Abed Bodla as our AI leader of the month at Becker Business and Becker Private Equity. He's going to talk about his company, Novescent, a technology consulting firm that focuses on mid-market and enterprise companies and does a remarkable job of filling the gap from idea to actual implementation of enterprise-level AI solutions. Just a brilliant, brilliant leader. Abed, can you take a moment to introduce yourself and tell us a little bit about Novescent? Sure. Thank you, Scott, for inviting me. I'm the founder and CEO of Novescent, a technology consulting firm where we help enterprise companies, think companies in the healthcare, manufacturing, and home-building industries, move from experimenting with AI to actually running AI in production on the Microsoft platform; I focus mostly on Microsoft. A little bit about my background: I've spent 20-plus years working with Fortune 500 companies and enterprise technologies, and I've seen a lot of technology waves come and go. What's different about AI is that the gap between a compelling demo and a system that actually works in the real world is enormous. That gap is exactly where we operate. At Novescent, we're in the middle of a very exciting AI engagement right now with one of the major medical imaging device manufacturing companies, where we're doing a very transformational project with their customer service engineer training department. We're using Microsoft AI Foundry and agentic AI to implement a process that fundamentally changes how their customer service engineers access knowledge and how their training department develops training material for those engineers.
Beyond the consulting work, the last bit here is that I've also built an AI SaaS solution, which taught me a lot about how a product gets developed, especially for AI. It's one thing to advise a client on AI strategy; it's another to actually build and deploy an AI product yourself. So I've gone through that experience of building a product and deploying it out in the real world. Which makes a huge difference. When your consultant has actually done what they sell, actually done what they work with clients on, it gives you a whole different level of understanding that's almost impossible to replicate, quite frankly. You mentioned this concept of the gap between the demo and an actual solution, an actual practical solution that somebody's using. We often think of that gap between one-off uses of AI versus enterprise solutions. Talk a little bit about how you help people bridge that gap from demo or idea to a useful enterprise solution. Talk about what that looks like. We know that's so important, but how is it actually done, and how do you work with clients on that? So a demo usually just picks one scenario, right? It doesn't look into the overall framework; governance and security don't come into play. And that scenario is usually very easy to implement because you have a very controlled environment, one use case, and everything works perfectly fine. But when you're building a real-world solution, you have to think about how it will scale, what guardrails you have to put in place from a security point of view, and how you'll get the data that's needed for the solution. For the project we're developing for this particular client, we have to pull in data from different environments. Their service manuals sit in one environment.
Their video transcripts sit in another environment, and SME knowledge lives in different email systems and shared drives. So building this up means creating a RAG layer that the agent understands, and then making sure the agent doesn't hallucinate. There are a lot of things you have to do to make it production-ready. That takes time, and it needs commitment from executives. Then you test it out and make sure it works, again and again, and eventually it works. But there's a lot of commitment and drive required. You work a lot with mid-market and enterprise companies that are moving to fill this gap, to get to real, productive uses of AI. What do their teams have to look like to do this well? Is there a certain amount of staff, intelligence, and commitment they have to bring to work with a consultant and make this work really well? Yes, of course. What we've seen, including on this particular project we're working on right now, is that everyone is busy with the different projects they're running over the year. Teams have already been reduced in multiple departments at the clients we've worked with. So commitment and buy-in at the highest executive level is very important: giving the team the time they need to work on it. Then there definitely need to be regular status updates and oversight, oversight that the project is progressing well and that, if bottlenecks come up, they take priority. And sometimes we've offered to help. For example, with the whole Azure infrastructure setup we have to do for this particular project, the client team is busy. So even though it wasn't part of our scope to start with, we offered to set it up for them initially and help while they're busy, so that the bottleneck can be avoided. It's about making sure the project doesn't drag and everything stays on track.
Executives are monitoring it, status updates are happening, and the end goal is very clear and communicated well for the project. Thank you. I love that clarity about the end goal, clarity about where we want to be. When people are doing AI integrations with cloud and enterprise solutions, where do companies underestimate the complexity of what they're trying to do? And how do they get through that complexity to fill this gap from idea or demo to actual performance? Where do people underestimate the complexity of what they have to do, and how do they end up making it work? So there are two things to it, right? Where do companies stand right now? Are they migrating their systems over to the cloud as part of this AI exercise, or as part of a broader digital transformation? Cloud migration, and data readiness to move to the cloud, can become an issue. Companies sometimes underestimate the cost of the integration, because in the beginning it looks like a simple lift and shift. But enterprise customers have built their solutions over years, and when you actually start to look at the integration, it becomes clear that it's more complex: workloads that have been built over decades, business logic buried in on-premise systems, custom integrations, hard-coded processes, undocumented dependencies where someone built something and that engineer is no longer there. So integration definitely becomes a challenge, and so does access to the data. Data and its access are most important when we talk about AI, because that's the foundation layer when we're talking about the RAG layer.
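The multi-source ingestion problem described here, service manuals, video transcripts, and SME emails feeding one retrieval layer, can be sketched in a few lines. This is a purely hypothetical illustration, not Novescent's implementation: toy keyword scoring stands in for real embedding search, and all names and sample documents are invented.

```python
from dataclasses import dataclass

# Sketch: normalize knowledge scattered across heterogeneous systems
# (manuals, video transcripts, SME emails) into one chunked corpus
# that a RAG agent can retrieve from, with provenance preserved.

@dataclass
class Chunk:
    source: str   # e.g. "service_manual", "video_transcript", "sme_email"
    doc_id: str
    text: str

def normalize(source: str, doc_id: str, raw_text: str, max_len: int = 200) -> list[Chunk]:
    """Split a raw document into fixed-size chunks tagged with provenance."""
    text = " ".join(raw_text.split())  # collapse whitespace from messy sources
    return [Chunk(source, doc_id, text[i:i + max_len])
            for i in range(0, len(text), max_len)]

def build_corpus(documents: list[tuple[str, str, str]]) -> list[Chunk]:
    """Combine documents from different systems into one corpus."""
    corpus: list[Chunk] = []
    for source, doc_id, raw in documents:
        corpus.extend(normalize(source, doc_id, raw))
    return corpus

def retrieve(corpus: list[Chunk], query: str, top_k: int = 2) -> list[Chunk]:
    """Toy keyword-overlap retrieval standing in for vector search."""
    terms = set(query.lower().split())
    scored = [(len(terms & set(c.text.lower().split())), c) for c in corpus]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for score, c in scored[:top_k] if score > 0]

docs = [
    ("service_manual", "SM-101", "Calibrate the gantry before imaging. Check coolant levels weekly."),
    ("video_transcript", "VT-07", "In this training video we replace the X-ray tube assembly."),
    ("sme_email", "EM-3", "Reminder: always log calibration results in the service portal."),
]
corpus = build_corpus(docs)
hits = retrieve(corpus, "how do I calibrate the gantry")
```

Because each chunk carries its `source` and `doc_id`, an agent's answers can cite where the knowledge came from, which is one common way to check a generated answer against retrieved evidence and reduce hallucination.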
Now I'll go back to the same example I gave earlier. For this particular project we're doing, the IntelliFlow project for this medical imaging device manufacturing company, building a RAG layer, especially for the service training department, is the same thing. The service manuals are in one location, videos are stored in a different system along with their transcripts, and the SME knowledge is sometimes an article, sometimes an email. How do you combine all of this together? How does this knowledge become meaningful enough for AI to generate the content they're looking for? It's definitely a challenging environment to deal with. But you know, the honest answer is that integration is always worse than you think, and AI projects have a way of exposing that fast, because the AI agent needs to look at that knowledge and then generate the content you're looking for. So integration challenges are definitely going to be there, and they need to be resolved. And going forward, as AI integrates more into the workforce and into what companies are doing and delivering, at mid-sized firms, what will that mean for the engineering teams, the software teams, the people traditionally in a lot of these roles? We're seeing the mega companies, Amazon, Meta Platforms, looking at some layoffs; some of that they say is due to AI, and some of it, of course, is due to them just trying to be a little more profitable so the markets respond to them better. But what will this mean for a typical engineering or technology team at a mid-sized company, in terms of AI's impact on that company's team and group? So it's conflicting. If you look at the media and the podcasts, some people say AI is going to change and reduce jobs very quickly.
But other people say, and the reality is as well, that it's not going to happen overnight, even if it happens. So it's not about fewer people; it's about different leverage, I would say. The best engineers today are the ones who can direct AI, not just write code. And the shift I'm seeing is from individual contributors who execute to engineers who orchestrate. You need people who understand how to design agentic workflows, how to evaluate AI outputs, and how to build systems where AI and humans hand off to each other intelligently. That's a fundamentally different skill profile from what most engineering teams were hired for. But this has happened during other technology advances as well: you have to retool your team. So for leaders, my practical advice is this: don't restructure your team yet. Upskill first. Bring them up to AI skills and let them take advantage of what we call vibe coding. Identify the engineers who are naturally curious about AI, give them real projects, and let them build that muscle in your organization; then redesign your organization around what you've learned. The companies that are getting this right aren't the ones who replace their teams; they're the ones who retool them, because your team has all the knowledge, especially at companies that have built systems over years in the organization. So they definitely have to retool their teams, they have to train them. Over time, I would say, they may not need their partners as much; that dependency may reduce. But right now I feel you still need your team to build the solution, build efficiency, and even build new ways of working that can contribute to the profitability of the organization. I think that's so right and so important. At the end of the day, it's going to be people plus technology, people plus AI.
And at the end of the day, to make it really work and customized for your group, for your team, for your company, you're going to need internal people to go with external resources like Novescent to make this all work. I mean, is that a fair statement? You're going to need both, aren't you? Yes, I agree. And whenever such a need comes up, you definitely have to decide whether you want to build something, buy something off the shelf, or partner with someone. The framework I use comes down to three questions: whatever you're trying to do, is it core to your competitive advantage? Do you have the talent to sustain it? And how fast do you need to move? If the answers to all three point away from building, then you partner with someone. And in the AI world right now, that almost always means partnering around a platform, not building foundational AI from scratch, except in very special circumstances. The real decision is which platform ecosystem to bet on and how deeply you want to integrate it. What I tell clients is: buy the commodity, partner on the platform, and build only what makes you unique. For most of our clients on the Microsoft stack, that means using Microsoft AI Foundry and Copilot to handle the heavy lifting, while we build the workflows, with a partner engaged in the beginning. And if I go back to my previous point, where I said train your team internally on AI, sometimes that takes time.
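As a toy illustration, the three-question build/buy/partner framework could be encoded as a tiny decision helper. The rules and names here are invented for illustration only; real decisions weigh far more context than three booleans.

```python
# Toy encoding of a build / buy / partner framework:
# (1) is this core to your competitive advantage?
# (2) do you have the talent to sustain it?
# (3) do you need to move fast?
# Purely illustrative thresholds, not a prescriptive tool.

def build_buy_or_partner(core_advantage: bool, have_talent: bool, need_speed: bool) -> str:
    """Return a rough recommendation from the three answers."""
    if core_advantage and have_talent and not need_speed:
        return "build"    # unique capability, sustainable talent, enough runway
    if not core_advantage and not need_speed:
        return "buy"      # commodity capability: buy it off the shelf
    return "partner"      # speed pressure or skill gaps: partner on a platform

# Example: a capability that is not core, with no in-house talent and
# real time pressure, points toward partnering on a platform.
decision = build_buy_or_partner(core_advantage=False, have_talent=False, need_speed=True)
```

The point of the sketch is the ordering: only a core, sustainable, unhurried capability justifies building; everything commodity is bought; everything else leans on a platform partner.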
So I do agree that if you want to speed things up, you definitely need to look for a partner who has skills in that particular area, who can make sure your POC is successful, and who can make sure that when you're building that ground framework, you're working with someone who has done these things multiple times, across different industries, and has broader experience of running into issues and resolving them. That partner can definitely help build a sound foundation or framework for an AI solution. And you've got this great experience of having worked both with and for Fortune 500 companies, and also having built your own stack, your own products that you work from, which is remarkable. And you talk about buying what's the commodity and then putting on top of it the stuff that really takes domain expertise and true partnership expertise. Is that a fair statement? Yes, I do agree. Yes, I do. And we talked about this a little earlier: when you come in to work with companies that are really trying to fill that gap to AI enterprise solutions, when you look at technical due diligence, what do people have to do to be prepared? You mentioned some people have to do a lot of work to get their data in the right order. What are some of the things companies should think about when they're getting ready to really deploy AI, more enterprise-type solutions, bigger solutions? The biggest thing, of course, when you bring in a solution, like I mentioned earlier, is integration, which sometimes runs into technical debt. Because AI needs to communicate with your systems, right? You have to see whether your organizational systems are up to par. If you have organizational systems that are not open, not easily accessible, then you get into making those systems accessible so that AI can understand and communicate with them.
And to some extent, that goes back to building middleware, building an integration framework to communicate with those systems. So that's what I would say: making sure your systems are ready to communicate with AI is definitely very important. And then there's the platform you build on; for us it's always the Microsoft framework, Microsoft AI Foundry, which is the enterprise RAG platform we use, plus Copilot and other systems to integrate with those systems and build it out. Fantastic. And talk for a second: when you look at the rest of this year, where are you most focused and excited as we work through the rest of 2026? What are you most focused on and excited about? So for those of us who work with mid-size to large enterprises, what I'm excited about is this: my focus over the past many years has been building custom workflows and business processes for clients, and I'm seeing more and more clients now bringing an AI flavor to it, looking at how they improve those processes by using AI. This particular project we're doing for a client, generative AI for training material and knowledge access, is happening more and more with organizations that see the value of it. They see how it helps their current staff do more, and do it quickly. I think I'm going to see more and more such projects this year, and I'm very excited about it. Making these systems go live, go into production, is very exciting for this year. And Abed, let me ask you this question: what advice would you give to a mid-sized, mid-market company that's trying to deploy AI at scale, that's trying to really make a difference and make the impact that AI can have? What advice would you give to companies trying to do that?
Are there overriding pieces of advice that you would approach a company with? So when we talk about scale, of course you have to select a platform. For us it's Microsoft; it's scalable, Microsoft AI Foundry is scalable. The same is true if you're working on Amazon; you just have to select a platform that's scalable. You also want to select a platform where your data already resides. In our case, most of our clients already have systems in the Microsoft cloud, Azure. So selecting the right platform, one that's scalable, is one thing, and I'm talking about the AI platform. Another is working with a partner who has done solutions at that scale and has experience delivering them. And the third, of course, is making sure those companies see the value of it and have executive focus and executive buy-in, so that those projects get done right and given priority. Last year, in my experience, was more about POCs, and companies were not much focused on bringing those POCs to production, but this year that seems to be changing. So they should stay focused on it. Challenges will come with any new technology or new product, so they just need to stay focused and work with companies that have experience building solutions. To rephrase it: select a platform that is scalable and can handle multiple models; Microsoft AI Foundry can handle 11,000-plus different models, whether it's an OpenAI model, Claude, or any other model. So focus on those three or four things. And then it has to be incremental: you have to make something go live that is measurable, that delivers value. Build on a success and then scale it out throughout the organization. Again, Abed, what a pleasure to visit with you.
Abed Bodla, CEO and founder of Novescent, our brilliant AI leader of the month. We are so thrilled to get to feature you, Abed. What you're doing is remarkable; the mix of intelligence and practicality is unmatched. Thank you so much for joining us today on the Becker Business and the Becker Private Equity podcast. Thank you very, very much. Thank you, Scott.