
Experimentation With AI Is Over. Here's What Wins in 2026.
Marketing leaders Courtney Baker (CMO of KnownWell) and the host discuss how AI adoption in marketing has moved beyond the experimentation phase into practical implementation. They emphasize solving real business problems rather than experimenting for experimentation's sake, and stress the importance of integrating AI into existing workflows and meeting cadences.
- AI experimentation phase is over - teams that are still just experimenting are now 1-2 years behind
- Successful AI adoption requires integrating tools into existing workflows and meeting cadences rather than creating separate AI initiatives
- Marketing leaders should focus on outcomes over AI usage - if teams can achieve results without AI, that's acceptable, but AI should be leveraged when results aren't being met
- Custom GPTs and similar tools remain underutilized despite being highly effective for repetitive, thinking-intensive tasks
- Teams need psychological safety and low guardrails to experiment effectively with AI tools
"I would say right now I think I see still a lot of people stuck in experimentation for, like, experimentation sake. And I think there was a season where that was awesome. It was great, and everybody should have been experimenting. But I think we've moved past that phase where your team is going to get tired of just experimenting."
"What I really care about is outcomes, period. The end. I want a marketing director that can produce the results that we've outlined as our goals for the company, for their role, for the department. And at the end of the day, the reality is I actually don't care if they use AI."
"If marketing teams, I believe, are in a state of fear, you are not going to produce great marketing. That's my personal belief. And so you have to create an environment where marketing teams can take risk."
"We became so data driven that we couldn't actually have this thing called discernment anymore. It's like the numbers needed to prove everything. If the numbers could prove everything, there would be no art in this."
It is 2026, and every marketing team is at least talking about AI at this point. They'd have to be, I don't know, on a nonprofit higher-ed marketing team to maybe not be talking about AI. I'm trying to think of the most far-behind industry there is, but even those people are talking about AI when it impacts marketing. Right now, if your team isn't already playing with AI, isn't already using their own ChatGPT account, it's one of those things where every marketing leader is thinking about how to implement it. So that's what we're going to be talking about today: how to actually help your team adopt AI wisely. And I'm here with Courtney Baker, who's, I think it's her third, maybe fourth time on the show. I don't know, you've been on here a couple of times, but it's been a while.
0:10
It has been a while.
0:56
Courtney, it's good to have you again. I know you are the CMO of KnownWell, which is a tech company that focuses on the service industry and helping them scale their businesses. But today I'm looking forward to having a conversation about what it takes for marketing leaders to actually equip their teams with AI and do it well. So let's dive into the first question: what are the most common ways AI adoption goes wrong with marketing teams? Because everybody's tried it, and many have failed and just blame it on AI. So I'm wondering, what are the most common things that you're seeing across other marketing leaders implementing these kinds of tools out in the space?
0:57
Yeah, I would say right now I still see a lot of people stuck in experimentation for, like, experimentation's sake. And I think there was a season where that was awesome. It was great, and everybody should have been experimenting. But we've moved past that phase, and your team is going to get tired of just experimenting. What we need to focus on now is taking problems, documenting what those problems are, and then saying, can we use AI to solve this problem? I would say that's one area. The second area where I see AI adoption fail is not bringing the AI to your work streams. For example, with people I always ask, hey, what's your meeting cadence? What drives your marketing team forward? And then how are you taking AI and applying it? It could be a lot of different things. It could be AI-driven intelligence for your pipelines, for example. How are you bringing those into those meetings, so you're driving the action, the cadence of your operations, with AI? Instead, it stays totally outside the natural workflows of your marketing team, and then it's just, you know, remembering to go use those AI tools, or trying to build new individual habits. But when you build it into the fabric of your marketing organization, it's going to make it much easier for your team to adopt AI, and for it to become just the default that your team uses. So those are two areas where I've seen it go wrong lately.
1:36
It's funny, I think I saw Ethan Mollick post recently about the best practices from a year or two ago: set up your team, give them some tools, give them some time to experiment. People are now doing this. Most companies are now in the experimentation phase, and those of us who are in it all the time are looking at it like, yeah, you're a year behind on this one.
3:31
Totally.
3:53
Experimentation's not enough. If you missed that period, you now have to catch up, because you are now behind about a year, maybe two years. I don't know. It really depends on which industry you're in.
3:54
And listen, I do get it, because some of those experiments failed or were hard. I remember the first time I was trying to build an agent and, you know, I was like, this is so hard. And I obviously work with a lot of very smart engineers and developers, and they were just doing things that I dream about doing. But I think when you get into these experiments and they don't work, you're less inclined to do it again. So if you get stuck in that phase, I think it's always helpful to go back to the problem: find one big problem that you can solve using AI, and I think that can get you back on track.
4:04
Yeah, I remember at the beginning even social media was experimentation, right? In the early days of social media, people were like, I don't want to post what I ate for lunch. Well, it took us a while to define the best practices of what became standard 101 content marketing and social media: posting things that are helpful, posting things that resonate, posting things that are maybe newsjacking, if you want to follow the trends. All these things became standard. They took a while. Some came from other industries, like the news cycle. Others had to just be invented. And I feel like we're just getting past that early phase of AI where we don't have any idea what we're doing, and now we have some good parts in place. So now I have to ask you: what does success look like if they're now doing it? What are the teams that are doing well now doing? How do you know AI has really become part of a team's day-to-day work?
4:53
Yeah, I think it's really a lot of not having to drive that change from leadership, when it's happening organically within your team. When your team starts to come to you with ideas, or even, hey, here's this AI platform that I want to use, here's the use case, here's the outcomes that I'm looking for, versus you having to say, hey guys, let's get this problem on the table, let's drive this action. I think that's when you know, hey, this has shifted. And then I would even go back to what I said earlier with your meetings. When the intelligence that you're using to run your marketing, when you start to have AI involved in those cadences, I think you can start to say, hey, we have made that shift from experimentation to really utilizing AI to drive our marketing function.
5:54
I feel like one of the big things that we've run into is this custom GPT thing that's kind of tried and true. Still, not a lot of people are using it, or using instructions and projects, or using Claude Projects, or using Gemini Gems, which are all the same thing generally, just different branded versions. But I feel like I could probably go to a team member and be like, what's the most normal thing you do that is time consuming, that takes thinking, and that you have a process for? Maybe you don't even understand what the process is. But let's figure out how AI can help at least speed up something you do daily or weekly and see what we can do there. And then try to do that with each specialty on your team. There's probably something there that's better than experimentation, because it at least comes back to what they actually do, rather than trying to invent new stuff that they don't do, which is usually what experimentation leads to: it invents new work that just makes you busier.
6:55
Yeah, it's true. And especially when it's a busy season, and then it doesn't work, and then it just feels like a waste of time. I actually said this on our podcast. One of my hot takes was that we would actually spend more time using AI last year than we'd save, for exactly that reason. There were so many AI tools, and we don't have front-runners yet for those things. It's exactly what you just said about social media. Over time, we kind of figured out what works, what the plays are, how you do this, which tools you use. You just knew, this is the best tool, I'm going to use that best tool. Right now it's like the Wild West, where it's like, okay, I want to do this thing, but I could do that thing with, like, 12 different options. It's not obvious which is the best tool to use for XYZ. And so we end up spending more time. I do think it's getting a little bit better. Do you think it's getting a little bit better?
7:48
Yes, absolutely.
8:51
By the way, on the custom GPTs, I'm absolutely with you. I'm still shocked how many people aren't using them. Some of my favorite things about AI are these custom GPTs. They literally save me so much time. I don't think our CEO would mind me saying this, but, for example, I have a custom GPT in his voice. It's a simple thing, but it saves both him and me a ton of time when I utilize it. We have one for lead qualification: we want to know more about a lead, we have it built, we drop the lead in, we know everything we want to know about that lead. Again, very simple things that just took some time to do the thought work, and every single day they produce returns for me. So, yeah, PSA for everybody.
8:53
Yeah. And I always tell people, until you've built like a dozen custom GPTs, don't worry about Make, don't worry about n8n, don't worry about vibe coding anything. And just to make everybody feel better, if you've tried Make or n8n and failed at it and felt like you were dumb, I know developers that are extremely intelligent who say they'd rather just hand-code it than use Make. Okay? That's how confusing those tools are.
9:48
That does make me feel better. Yes. Thank you for that. I will take that one to heart.
10:14
Oh, I should just be able to drag and drop from the menu, and then you get the menu options, and you're like, fricking... I don't even understand what it's saying.
10:20
Yeah. And it's especially hard when you're using Make with a platform like HubSpot, which I feel like I'm a pretty decent user of, and I'm like, I'm completely lost here.
10:25
Yeah. It makes you realize how much platforms like HubSpot and HighLevel, or the marketing automation systems, have dumbed it down for us marketers: yes or no, left or right. They made it simple enough for us to go in there and actually make choices. Versus now it wants a boolean search, and you're like, what the heck is it talking about? What's the IP address? You're like, there's an IP address for this?
10:39
Yeah, exactly. But you know, I think it's going to come. I actually said this when I was with a company last week. They were working on some AI things in their organization, and this wasn't necessarily tied to my role at KnownWell, and we were talking about how much they needed to add about agents in their policy. And I was like, you know, right now it's like what you said: to the common person it's challenging. I'm glad that we can all establish that transparently. It's not the easiest. But it could come tomorrow, Dan. We could have some kind of platform that does make it easy enough for me to just pull the things in, like it seems like it should be, and the whole ball game changes.
11:05
I think this is my prediction, but platforms like n8n and Make will just go away and we'll just vibe code them. We'll be like, hey, here's this application, here's the password for it or whatever it needs; every time this happens, I want these things to happen. And it'll go, okay. And maybe it'll even make some advisements and be like, well, when you get to this point, do you want it to go left or right? You're like, well, left. I think we'll just vibe code the automations.
11:55
I do love doing that. Listen, with our team, obviously, again, we have data scientists and our whole engineering team. I like to joke that I am a data scientist, because really, it's just conditional logic, and marketers do that. I'm like, I'm basically a data scientist. But I do love when you can code something like a calculator. Now, has my stuff actually ever seen the light of day? No, but I've been able to hand those things off to developers, and they're much further down the road than they would have been pre me being able to do that.
12:21
So I have my next question, but I'm going to tweak it a little bit and ask it a different way. I think this will be interesting, but you tell me; there's two different ways you can answer this. I originally had: if a marketing leader has 90 days, what's the plan? What should they do in the first two weeks, the middle, and the final stretch, as if you were the marketing leader? But since you're a CMO, I'd like to ask: if you hired a marketing director who was pretty good with AI, what would your actual expectations be as a marketing leader for where an AI-driven marketing leader could be in 90 days? What would your hopes be, beginning, middle, and end?
13:01
Yeah, it's really interesting. I'm gonna go with the second one.
13:41
Okay.
13:44
I may intertwine the two, but I'm gonna answer the second one, because what I really care about is outcomes, period. The end. I want a marketing director that can produce the results that we've outlined as our goals for the company, for their role, for the department. And at the end of the day, the reality is I actually don't care, and I would say this is true for a lot of executives, I actually don't care if they use AI. Now, do I think that they could produce better outcomes utilizing AI? I do. But at the end of the day, what I really care about is the outcomes that we've set out to achieve. So one, I think that's always helpful framing. Now, if we're not hitting our outcomes, maybe one of the questions I'm asking is, well, how are you utilizing the resources, the tools, the assets at your disposal? Oh, you haven't used AI at all? Okay, well, why haven't we pulled that lever? And so that's the series of questions I'm dissecting against. But at the top level, ultimately, if you can produce stellar results without using AI because you don't need to, it does not matter to me. Again, I don't think that's the world we're necessarily living in today.
13:46
Yeah.
15:19
And then I will actually answer your other question too. With directors, with my team, I want them to set an environment that allows experimentation. What I really mean by that, and I actually think this is true for marketing teams at large, is that we have to lower the fear. If marketing teams are in a state of fear, I believe you are not going to produce great marketing. That's my personal belief. And so you have to create an environment where marketing teams can take risks. Those risks sometimes will pay out tremendously for you. There will be times your marketing team fails. Matter of fact, Dan, we just did something on our team. We ran a new ad campaign with a newsletter. We took a small bet on it. It didn't work. We failed at it. But I am in an environment where we can take those kinds of experiments and everything is great. So I would say the first thing out of the gate when you're rolling out AI is you've got to set the tone that we are doing this together. There's no fear in this.
16:47
We're creating an environment where we can take risks, we can try things without judgment. And then you move into, and I'm a broken record here, Dan, you've gotta start with where the problems are. You've gotta use real work. We're moving to outcomes, not playing around with new tools.
16:48
I remember two years ago, I was talking to one of my guests named Bart Kaler, and I think we both had this epiphany that we were using AI, but it really wasn't helping any of our core stuff move faster or better. It was really adding on new stuff that we could do, where we're like, oh, we've always wanted to add on these extra things. You know what I'm saying? It just added extra stuff to do. It was faster, and it was stuff we'd wanted to do. But now I think it's getting good enough that it's actually starting to chip into the things that we do regularly. And I think you're right: always align it back to, does this create value for the customer? Is this actually helping us do better marketing? Is this actually helping us do the core thing, or helping us move toward the objective? Otherwise it's easy to get lost, so you keep bringing it back to that point. But it needs to be brought back, because we can all have a little bit of FOMO, a little bit of ooh-shiny going on, and just keep chasing the next cool thing. Kind of like infographics. I know we make them. But do you really need them?
17:11
Yeah, I know we were joking about me coding something for our development team. I actually think in marketing teams that's one of the best use cases, even with design: being able to collaborate so much quicker. I always say marketing is a team sport. Well, if it is a team sport and I can hand off the football faster, knowing what play needs to be run, that's incredibly helpful for producing an outcome faster and scoring a goal. I don't know why I'm making this football analogy other than I've been watching too much football. Dan, it's that season.
18:06
We're getting close. So, have you run into a situation where you've hit pushback from team members over AI, or maybe from vendors or people that you're working with, around job loss, quality worries, or just extra work?
18:44
Yeah, I would say the place I see it the most is people wanting AI intelligence or signaling to be 100% accurate. And listen, we all want accurate, but the reality is they don't really understand what AI is at its core, or what's different about it than, say, SaaS: what the output of my CRM with manual-entry, deterministic data is versus what AI produces, and how those things are different. In some ways I think the pushback is understandable, but if you understand the core of what your platforms or your tools are based on, it helps you understand, oh, I'm not going to get a 100% signal or prediction every time, so I have to change how I think about this technology and how I utilize it. This is a very business-operations type answer, but that's actually where I see a lot of pushback: not really understanding AI well. If you're not deep in this, it's easy to look at it just like deterministic software, which it is not. You and I know that, and probably everybody listening. But that is where I see the rub a lot of the time, and I think it causes some of that pushback. And again, I understand the pushback, but I have this deep desire: if we can educate more on how AI is different from a lot of the technology that we've used previously in our roles, I think it helps us get past that roadblock. Do you ever see that?
19:01
All the time. All the time. And it's funny, because I'm like, if we actually made it more deterministic, it would suck. It would lose all of its actual creative ability. And now with the reasoning models, they're so freaking good. I'm doing some market research for a company, refreshing even a standard voice-of-the-customer document. I'm just trying to find out, how do customers talk about this brand? It's actually research I did a year and a half ago, and I was using AI then, but that was before reasoning models. That was before deep research. That was before it was able to run through tons of information and summarize it well. So now it's totally different. Now I'm comparing it to market research I did a year and a half ago, and it's better, because it can go scour the whole Internet for every single testimonial and find the exact language. And I know, because I'm reading this person's books, and I'm like, oh yeah, this is the voice of the customer; I can hear the founder leading in it, but I can still hear the individual ones. And of course, I always write my deep research prompts in such a way as to say, show me with links.
21:07
Yeah.
22:13
To spot check it if I need to, because it can make stuff up. But if you always write your deep research or your thinking prompts with, hey, show your work: show me the actual quotes from the customers with the link to where they said it, so I can go check if that's what they actually said and spot check it, then it's moving along at a much higher quality. But you do have to know what it can and can't do. I asked it to give me an acrostic for the word "prey," and it's like, oh yeah, but prey has five letters in it and it was trying to get it down to four. And I'm like, no, ChatGPT 5.2 Thinking, you were wrong. Like, how did you get that?
22:13
I am curious, I have a question for you. I do think when, at an individual level, we're using ChatGPT, it's easy in a big research project like that to get lost in, oh my gosh, is this actually right or wrong, and evaluating it. But in general, when we see the responses, we know that they could be wrong. But I think when we take the leap from Gemini or ChatGPT into AI
23:01
native
23:35
business platforms or tools, we kind of lose that lens, in a sense. We don't have this expectation of ChatGPT being 100% accurate, but once we move to enterprise, we put on this extra layer of, it needs to be 100% right all the time. Dan, I mean, I'm not kidding, I have been in many executive team meetings where the CFO brings in the data from the last round of financials, and it just becomes a debate about whether the numbers are right or wrong. And so I wonder if we just like to not actually deal with issues and instead debate if data is right or wrong. So maybe it's just a business thing, but it's interesting. With ChatGPT, we don't have such a stringent expectation; we know it might not be right. But when we move into enterprise, we kind of lose that lens. I'm just curious if you have an idea why we don't hold onto that as we move into other AI tools.
23:38
I mean, I think marketers have been trained to be more data driven to the point where it just became stupid. I don't know what to say other than we became so data driven that we couldn't actually have this thing called discernment anymore. It's like the numbers needed to prove everything. If the numbers could prove everything, there would be no art in this. And there's tons of art and guessing and discernment and gut feeling when it comes to all kinds of business things. Do we want to look at the data? Of course, to inform our gut, to inform our discernment and decision making. Isn't that why you hired someone who's senior level? Because they didn't come with the data, they came with the discernment. And that's still what drives things with AI. In fact, that's still what makes someone better at AI: usually that taste and that discernment, because I can look at something and know if it's a good marketing plan or not for a specific company. I can discern that. With ChatGPT, ever get a response and you're like, nope, that wasn't it? And you realize, I'm gonna refine that prompt. I'm gonna hit that edit button and refine it, because that wasn't it. But sometimes you enter it and you're like, whoa, that was A-plus-plus. I wasn't expecting it to be that good. Chat, digital high five. And it's like, you're welcome. But that's the discernment, and that's why it's so important. And I think that plays a bigger role in AI, because AI can handle the data better, honestly, than we can a lot of the time.
24:51
So wise. Yeah, we have become so data driven, but how much data do we need to know we've gotta go fix a problem, or that there's something we've gotta take action on? Do we have to get it to 100%? No, probably not.
26:18
Yeah. I mean, even look at, I can't remember which book this is, but some of the basic financial books, like The Millionaire Next Door, I don't know if it was that one, but one of those really basic, entry-level finance books. I remember hearing these guys talk about day trading and stuff, and they're like, really, sometimes you just have to know a lot about the industry and kind of be in it, but then hear normal people talk about it. If you're hearing normal people talk about it, it's probably a bigger deal than you think, and maybe you're catching it before all the data-driven people who are day trading over in Manhattan
26:34
Yeah.
27:05
are actually picking up on it, because you're picking up on a level of discernment that the numbers aren't showing yet. Because by the time the numbers show it and have proof, the stock's already up high.
27:06
Yeah.
27:17
Right. So that's the most data-driven industry of all time, right? But still, the people who know how to play the game are using discernment and experience to make smart decisions. I think it's the same thing with marketing. We've been so data-heavy because the martech companies have pushed it so hard and done a good job marketing it. And we'll probably swing back the other way now. That's my guess.
27:18
So good.
27:39
So, with all teams, everybody's got AI as an agenda item this year. We all have it. It's going to be a thing for every single team. What are some of the basic rules that you see teams needing to apply early on? What kind of guardrails around brand, privacy, and approvals do you see helping teams, but also slowing them down?
27:42
Yeah, it's a really good question. I think what slows them down is what I talked about earlier: it is that FOMO, there are all these new tools, new gadgets, new trinkets, and it's really about focusing on outcomes, real problems, things you've got to solve, or things where you're like, we could do this a lot faster, let's utilize AI to do it. I would recommend keeping the guardrails as low as you can, especially to produce, let's say, a V1 of something. You may open the guardrails completely for, hey, to get to a V1, go to town; before we publish something, we may need to approve it, but really allow the team to have some room to move. Of course, we've gotta be careful: if you're in certain regulatory environments, the bar is much higher. I would say run it as much like a startup as you can, and as someone in a startup, I will tell you the guardrails are low. And make sure that you practice what you preach. So if you want your team to bubble up things, to find areas where they can utilize AI, you should be doing that as well. If you want that environment of not being afraid to bring a new tool or try a new custom GPT, you should be doing that and bringing it back to your team. I would also say, when it comes to guardrails, hopefully you have some kind of corporate structure that you can just live within; then lower the guardrails for your team to work within that corporate structure and move quickly. Again, lower the guardrails, especially for things you're producing but not publishing externally.
28:06
I remember getting a word of advice from a senior executive once, and he said, never apply big-company rules to small companies.
30:21
No.
30:28
Sometimes you think it's wisdom. You're like, we should all have password managers and password protections and really strong encrypted passwords. And you're like, we're a team of five. We don't need that.
30:29
Yeah, yeah. I mean, I've worked in much larger companies where I've had much more extensive brand guides. Even those brand guides are not appropriate for our startup stage. It's too much; it's too high a guardrail. And I would say that's a good analogy for what we're talking about with AI: put appropriately sized guardrails in place for the size of company and the regulations that you're working within, and default to the lowest guardrails that you can. I also would say, if your company has not gotten ChatGPT Enterprise, or if you're using Google Workspace and you're not using Gemini Enterprise, that is a huge unlock, and it allows those guardrails to be lowered, because you're working within your own protected ecosystem. Dan, I'm sure you're a fan of that, but I'm telling you, that was such an unlock for our team, and we're an AI company.
30:38
Like, if you're using Gmail and you don't trust Gemini, something's wrong, right? Guys, all your data is already in Google Drive. At least give your employees access to use it in Gemini. Because if you don't trust Google with Gemini, then why are you using Google Drive?
31:49
I mean, it essentially becomes your knowledge management. Everywhere I've ever seen, their Google Drive is a mess. Now it is like a secret weapon for actually finding the knowledge
32:03
Your team needs it baked into Drive. Have you tried it yet? It's amazing.
32:17
Oh yeah, it's.
32:22
It is. It's like what search should have been a long time ago, but it can actually do stuff now.
32:23
Yeah, no, no, this is one of those things where you're like, oh, finally. You know, finally. I'm still waiting for that from my Alexa, just so you know. It still hasn't happened, so just putting it out there: Alexa, I'm watching.
32:28
Well, I did have a chance to meet with some enterprise companies at a conference a couple of months ago, and I learned something. There was upside, but a lot of downsides of working at the really big enterprise companies. They're not even allowed to use Gemini. They're not on Google, they're on Outlook, and they're not allowed to use any AI except for some super secure, privacy-focused, in-house enterprise AI that's like a year and a half behind current models. A lot of the enterprise companies are stuck with that. But they also have bigger budgets, so they can afford things that do some really impressive AI at scale. So there's that too.
32:45
But yeah, as someone who sells an enterprise AI-native platform, I would be glad to talk to them about bringing in intelligence on their clients.
33:23
That's it. That's the shameless plug. If you've got budget like that, you can afford some of the coolest AI tools out there, ones that are out of reach for all the small companies. We're all using ChatGPT, which is fantastic and an advantage in itself because we can try things faster. But there are some really cool tools, kind of like what you guys are working on.
33:33
Yeah. We're using ChatGPT because we literally don't have the people you'd need to do the thing.
33:50
So tell us a little bit more about KnownWell, and even your podcast over at AI Knowhow. I think a lot of people from the show would love to take a listen there too.
33:56
Yeah. KnownWell, we created specifically for B2B services firms. We talked to so many firms, Dan. It didn't matter if they were small or an enterprise company, they were all tracking the health of their clients on a spreadsheet: red, yellow, green. Their team would come in, and I was always shocked. I would think, no, this company is so successful, surely they've figured this out. And we realized we're in a new era with AI, where we could actually track this using AI and have a platform where on Monday morning you're not having to come in and call all the people or check the spreadsheet. You just know, immediately, in real time, the objective health of your clients across your entire portfolio. So that's what we've built at KnownWell. We're extremely excited and proud of it, and we'd love to give anybody listening a look at it. I know for all the marketers listening, the top of the funnel is always the emphasis. But if you've got a leaky bucket, and we always hate the leaky-bucket analogy, that leaky bucket is just hard on the marketing side. It's just more pressure for sales and marketing, because the answer is always to pour more water in.
34:04
You know, there's nothing more painful than crushing it in marketing, and sales is doing well, and you're like, yes, we crushed it. And then the churn is so high that you're like, now we need to crush it double, because we can't hold on to customers. You're like, exactly. And I hate myself.
35:24
Exactly. Well, and Dan, it kind of comes back to what we talked about earlier. We have been drilled on data, data, data. Take sales. Salespeople, think back 15 years ago: they just went out and did their thing. They were lone wolves. They would come back having sold something your company doesn't even make. It was all about relationships, getting on the golf course. That all changed. We changed that system from being about relationships and vibing to being about data. Now, for professional services firms and B2B SaaS companies, that same transition has to happen on the post-sale side. It can't just be about vibes, about how we feel the relationship is going. It has to be driven by data, by intelligence, to know where we actually stand. So I'm really excited about that transition pulling all the way through the entire customer life cycle, and hopefully taking a little pressure off us sales and marketing people.
35:38
I think there's probably a whole conversation just around how AI can turn the qualitative into the quantitative. Not so that we become so data-driven that we can't make a decision without numbers, but so we can actually be properly informed and hold the right systems, or the right people, accountable. I think that's what it's all about.
36:50
Absolutely.
37:10
Thank you so much for joining me again on The AI-Driven Marketer.
37:11
Thanks for having me.
37:13