Everyday AI Podcast – An AI and ChatGPT Podcast

Ep 748: Plugins, Microsoft’s AI Comeback and New AI Video. 7 New AI Features You Should be Tracking

30 min
Apr 3, 2026
Summary

Episode 748 covers seven emerging AI features including ChatGPT plugins in Codex, Google Gemini's memory import tools, Slack's 30+ new AI capabilities, Perplexity's artifact generation, Microsoft's dual-model research agents, and Google's cost-effective Veo 3.1 Light video model. The host emphasizes that AI updates are shipping faster than ever, requiring a new Friday features format to track smaller but impactful releases.

Insights
  • ChatGPT plugins in Codex represent a shift toward AI super-apps that combine multiple tools and local file access, positioning Codex as superior to browser-based alternatives for complex knowledge work
  • Microsoft is aggressively catching up in AI by borrowing best practices from competitors (Perplexity's dual-model research, Anthropic's Claude co-work technology) while leveraging its enterprise distribution advantage
  • AI video generation is becoming commoditized through cost-effective models like Veo 3.1 Light, enabling widespread adoption across platforms and use cases previously priced out of AI video production
  • Enterprise platforms (Slack, Microsoft 365, Salesforce) are evolving from communication/productivity tools into autonomous agent ecosystems that handle multi-step workflows without step-by-step user prompting
  • Portability of AI memories and chat histories is becoming table-stakes, with Google, Claude, and others enabling users to switch platforms without losing personalization or context
Trends
  • AI super-apps consolidating multiple tools and capabilities into single platforms rather than point solutions
  • Dual-model and multi-agent research pipelines becoming standard for accuracy and hallucination mitigation
  • Enterprise AI shifting from assistant-based (step-by-step help) to agent-based (autonomous task completion) models
  • Cost-effective AI model variants enabling production-scale deployment across consumer and developer platforms
  • Interoperability and portability of AI memories/contexts becoming competitive differentiators
  • Legacy tech giants (Microsoft, Google, Slack/Salesforce) acquiring AI startup capabilities through partnerships and feature borrowing
  • Reusable AI workflows and skills enabling organizational scaling without individual prompt engineering
  • AI video generation moving from experimental to infrastructure layer powering third-party applications
  • Multi-step knowledge work automation (research, formatting, delivery) consolidating into single tools
  • Frontier/early access programs becoming primary distribution channels for enterprise AI features
Companies
OpenAI
Released ChatGPT plugins for Codex with 20+ curated integrations including Gmail, Google Drive, Slack, Figma, and Notion
Microsoft
Launched Copilot research agents (Critique and Counsel), Copilot co-work, and announced development of open-source al...
Google
Released Gemini memory import tools, Veo 3.1 Light video generation model, and new research agent capabilities
Slack
Announced 30+ new AI capabilities including meeting transcription, reusable skills, MCP client, native CRM, and voice input
Anthropic
Claude co-work technology powers Microsoft's Copilot co-work; Microsoft is major investor; Claude's features being adopted by Microsoft
Perplexity
Added artifact generation for presentations, spreadsheets, dashboards, and websites; features being adopted by Microsoft
Salesforce
Parent company of Slack; now bundles Slack with every Salesforce customer; Slack bot integrates Salesforce CRM and Agentforce capabilities
Figma
Available as ChatGPT plugin integration in Codex for design workflow automation
Notion
Available as ChatGPT plugin integration in Codex for knowledge management and documentation
Claude (Anthropic)
Used in Microsoft Copilot Critique agent to review OpenAI-generated research reports for accuracy and citation quality
People
Jordan Wilson
Host of the Everyday AI Show providing analysis of weekly AI updates and features
Quotes
"Codex is not just for coders or developers. I've been saying all along they should have released this as a separate app"
Jordan Wilson~8:30
"If you need something right, if you need to be 100%, if you have challenging knowledge work or challenging dev problems, you go to Codex"
Jordan Wilson~10:00
"Microsoft, I'm just saying, they've been cooking"
Jordan Wilson~18:30
"Before this, Researcher inside of Copilot used a single model end to end with no built-in quality check, so hallucinations and citation gaps fell on the user"
Jordan Wilson~20:00
"AI moves too fast to follow, but you're expected to keep up. Otherwise, your career or company might lag behind while AI native competitors leap ahead"
Jordan Wilson~15:30
Full Transcript
This is the Everyday AI Show, the everyday podcast where we simplify AI and bring its power to your fingertips. Listen daily for practical advice to boost your career, business, and everyday life. This week's newest AI features kind of feel like throwbacks. For context, we have to go back to last week, when the hottest new AI updates were all about controlling agents from your phone on the go. Very futuristic. But this week, comparatively, might feel kind of retro. I mean, ChatGPT plugins are kind of back, and Microsoft and Slack are suddenly, at least temporarily, the hottest names in AI this week. But don't worry, there's still plenty of bleeding-edge AI tech that came out this week that you might have missed. And if you did, our new Friday features series will get you caught up quickly on the smaller AI updates that can make a big difference in your day-to-day work. All right, so stick with me for the next 20 minutes, and here's what you're going to leave with and what you will learn. First, why we'll probably start seeing a lot more AI video everywhere, thanks to Google; how a legacy tech giant is borrowing from AI startups to bring new features to a struggling ecosystem; and how Slack is using AI to go from a messaging platform to a do-everything platform. Welcome to Everyday AI. If you're new here, what's going on? My name is Jordan Wilson, and we do this thing for you: it's a daily livestream, podcast, and free daily newsletter helping everyday business leaders like you keep up with the nonstop updates that are happening in AI. I tell you what matters, why, and how to use it to grow your company and your career. So thank you for joining. Normally, I don't sound like this, so if you're on the podcast, yeah, my allergies are kind of wild. But if you are like, wait, what did he say? It's all going to be in our newsletter, so make sure you go to youreverydayai.com.
Yeah, this is for all the people that, when I've been sick for the last like two months, are like, oh, your voice is AI. Well, can AI video avatars sound this congested? I don't know. So let's get started. If you are new here, well, here's kind of our weekly setup and why we're doing this new Friday feature. On Mondays, we do the AI news that matters, and that's turned into, well, a bunch of policy stories, big tech drama, trillion-dollar deals, right? And then on Wednesdays, we go hands-on and we dive deep into one platform and do a hands-on demo. And what I've realized over the past three years of doing Everyday AI is that that leaves out probably like 90% of the useful AI features you can use to improve your day-to-day, because starting in 2026, the rate at which the big companies are shipping, it is impossible to keep up with even if you have a daily podcast. So that is what Friday features is all about. So if you do like this show, please let me know; leave a hot or not, just say hot or not, so I know if I'm going to keep this going over the long run. All right, let's get into it. This week, we're talking plugins, Microsoft's AI comeback, new AI video, and seven new AI features you should be tracking. We're going kind of live-ish, bouncing around here. If you are listening on the podcast, I'm sharing my screen now; nothing overly visual, but if you ever want the video version of the podcast, you can always go to our website and watch that and 750 other episodes for free. So, first update: plugins are back, baby. Not in the same way they've always been, but OpenAI did just release plugins, and they are now live in Codex. So here's what it is, who has access, why it's useful, and who will find it valuable. 20-plus curated plugins are now live in the Codex plugin store. That includes plugins like Gmail, Google Drive, Slack, Figma, and Notion.
So right now, well, you do have to obviously be a Codex user. And this is in the Codex app, the CLI, and the VS Code extension, for ChatGPT Plus, Pro, Business, Edu, and Enterprise users. So you do have to be a paid ChatGPT user. A full plugin directory for external submissions has been referenced, but it is not yet confirmed, so chances are we're going to see a lot more plugins in the near future. So if you're confused about what a plugin is: yes, ChatGPT used to have plugins, and then they had connectors, and then they kind of got rid of connectors, but really they just renamed them to apps. But now ChatGPT also has skills. Confusing, right? We technically have plugins, connectors turned apps, and skills. So plugins are essentially reusable workflows with skills and app integrations; it kind of combines skills, apps, and reusable workflows. The way that OpenAI describes it on their blog, they say plugins bundle skills, app integrations, and MCP servers into reusable workflows for Codex. You can extend what Codex can do. For example, you can install the Gmail plugin to let Codex read and manage Gmail. You can install the Google Drive plugin to work across Drive, Docs, Sheets, and Slides, or install the Slack plugin to summarize channels or draft replies. So here's why it's useful. Well, I think it's turning Codex into what we're essentially going to see in the future: the ChatGPT super app. And if you are a long-time listener of this show, what I'm saying now is going to sound like a broken record. Y'all, Codex is not just for coders or developers. I've been saying all along they should have released this as a separate app, and sure enough, they are rolling ChatGPT, their Atlas browser, and Codex into one super app; they did confirm this this week. But I do see plugins maybe being one of the ultimate stickier kind of factors. I don't think these are going to go away when we get a super app, because this brings kind of that co-work, right?
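As a mental model, OpenAI's description above, plugins bundling skills, app integrations, and MCP servers into one reusable workflow, can be sketched as plain data. This is purely illustrative: the field names and example values here are assumptions, not OpenAI's actual plugin schema.

```python
# Hypothetical sketch of a "plugin" as a bundle of skills, app
# integrations, and MCP servers. Illustrative only; this is NOT
# OpenAI's real plugin format.
from dataclasses import dataclass, field

@dataclass
class Plugin:
    name: str
    skills: list = field(default_factory=list)        # reusable prompt workflows
    integrations: list = field(default_factory=list)  # connected apps (Gmail, Drive, ...)
    mcp_servers: list = field(default_factory=list)   # tool servers the agent can call

# A made-up Gmail plugin, mirroring the kind of bundle described above.
gmail = Plugin(
    name="gmail",
    skills=["triage-inbox", "draft-reply"],
    integrations=["Gmail"],
    mcp_servers=["mcp-gmail"],
)
```

The point of the shape is simply that one named bundle carries all three layers at once, which is what makes it shareable and reusable across a team.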
So Anthropic Claude's co-work has been, deservedly so, very popular and very viral over the last couple of months. But I think what we're seeing now with Codex and plugins is bringing some of that day-to-day knowledge work utility to Codex, because right now, you can do a lot of these things inside ChatGPT, even with the ChatGPT apps, but, number one, they don't have access to your local files, your local computer. And that's the big difference here with Codex, because Codex can access all your local files: read, write, move things, run in the terminal, right? So that's the big difference here, and why I think plugins are going to be a big deal in Codex, or the super app, or whatever it ends up being. So who's going to find this valuable right now? Obviously, engineering teams, those that are already using Codex a ton. But I think this is also going to be extremely useful for normal ChatGPT users, non-devs, non-software-engineers, who are experimenting in Codex, or if they like co-work and they're like, wait, ChatGPT can now do this via Codex? Yes, you can. I highly, highly suggest you check out Codex if you haven't already. Yes, Claude Code and Claude co-work are better at certain things. They're usually faster, and way better at front ends. But if you want something done correctly, right, take this from me. Yes, this is annoying, but right now I'm currently running both Claude Code and Codex. If you need something right, if you need to be 100%, if you have challenging knowledge work or challenging dev problems, you go to Codex. If you need front end, or if it's not very hard, you can go to Claude Code. So plugins: I'm extremely bullish, and I think you should check them out. All right, our next AI feature you might have missed. Well, now it's much easier to bring your history from any AI into Gemini.
So we've actually seen this from Claude a couple of months ago, and now Google shipped the same thing. There's two new tools that Google just shipped for Gemini. One is the ability to import your memory from other chatbots. Essentially, Google gives you a copy-and-paste prompt, and then you can put in the results. And then there's also an import chat history tool, where you can upload a zip export of full conversations from ChatGPT or Claude. Google is also rebranding past chats to memory to reflect Gemini's evolving personalization layer. So pretty, pretty big news here. And I'm actually kind of shocked that it didn't take longer for other companies to do this, because Anthropic was the first one, and then people just figured out, wait, this is just a prompt that you put in to extract all this information and copy and paste it. Right. So Google is going the same route with two different new features here to transfer your history from any other chatbot into Gemini. The way that they market this, they say: get your memories in just a few clicks; transferring your memories from another AI app is pretty easy, and Gemini can walk you through it. It can also save your preferences and transfer your chat history. And here's who has access: it is available to all accounts, free and paid, right now, though not available in some of the EU countries for privacy reasons. But I think this is super useful, because if you are maybe looking to switch from Copilot, or switch from Claude, or switch from ChatGPT, and you want to try out Gemini, you probably don't want to start from scratch. And I do think that this is going to become the norm. I think in the future, we're just going to have a kind of modular memory system that we can actually plug into, right? Think of something like, you know, Google Drive, or Box, or SharePoint, right? I do think that there's going to be a dedicated version of this.
That's just for your skills, your markdown files, you know, plugins, connectors, etc., that you can just bring around and plug in, and you won't have to have a separate memory in each one. But until that time, I do think that this is going to become very common. And it's a good move from Google, obviously, making it available for free. So this is good if you're just trying out Gemini, or if you put a ton of time into personalizing ChatGPT or Claude; this is a great way. You're not going to get 100% of the way there in a simple copy and paste, but it is going to, I think, close the gap and help you get better results out of Gemini. Also, for business users that are evaluating kind of multi-platform AI strategies and want to, you know, trial Gemini risk-free, it's a great way to do that. One thing to keep in mind: zip uploads are capped at five gigabytes per file, five per day, and re-uploading the same files overwrites previously imported chats. So let's get going on to our next update. And this one was a big one; it was actually more than 30 updates rolled into one. Yeah, are some of the platforms from the late 2010s and early 2020s roaring back? Well, maybe, because Slack just made a huge splash. So I'm kind of showing off their landing page here; I'll read a couple sentences. Here's what's new with Slack. They announced more than 30 new capabilities that take Slackbot from a personal agent to the ultimate teammate. They say: together, they don't just extend what Slackbot can do, they redefine what Slack is for the agentic enterprise. Everything shipping is built around the same idea: more impact per person. So some of the things that are new: a new meeting transcription and note-taking capability, if you're doing those meetings inside Slack; they have reusable AI skills; there's a new MCP client; there's native customer management built into Slackbot, right?
So obviously, Slack is owned by Salesforce, its parent company, so you get some of Salesforce's CRM and Agentforce capabilities in Slack. And there's a couple more. So aside from that, just let me look at my list here: reusable AI skills, the MCP client integration, meeting transcription and summarization, native CRM, memory and personalization, deep research, and also voice input. It's a lot. So this is rolling out in the coming months. Not all of these are live today, but they have started to roll out, first with Business Plus and Enterprise Grid plans, right? Also, every Salesforce customer now gets Slack bundled in. So here's why it's useful. Well, it's now much more than just a messaging platform. Before this, Slackbot, which is kind of their AI agent inside of Slack, was more of a basic reminder or notification bot. But now, like a true agentic teammate, it can draft emails, schedule meetings, transcribe calls, update your CRM records, and run reusable multi-step automations. So a pretty big step forward, I would say, from Slackbot as this kind of AI chatbot that you would just chat with into now more of an agentic teammate that can complete tasks for you. Also, the reusable skills, that's big, right? That means a team can define a workflow once and then share it across the entire org, instead of everyone writing their own prompts. So here's who's going to find it valuable. Well, any enterprise team on Slack, right, which is a lot: from startups to small businesses to enterprises, a lot of people are on Slack. Also, if you're on Salesforce, well, now you get access to all these things. Also, small businesses are kind of targeted with Slack's new built-in CRM as a gateway to full Salesforce. So maybe you've been tinkering with a couple CRMs, not super happy, but you do use Slack.
Well, now you have kind of a Salesforce Light built in. All right, our next update. This one, y'all, I almost missed, which is hard for me; this was a footnote in a release from Perplexity, but I think it's actually pretty big. So here's the new update: now, inside Perplexity, Deep Research and Pro Search users can generate presentations, spreadsheets, dashboards, websites, and other structured outputs directly inside the product. That's pretty big, right? These are the things that, well, we spend a lot of time doing, right? I think a lot of what we human knowledge workers do in the age of AI is fight against walls of text, and we have to, you know, do a lot of copying and pasting, a lot of human duct-taping between different AI systems, because ultimately, the artifacts that we're all using are things like dashboards, spreadsheets, presentations, and websites. And it wasn't really until late 2025 and early 2026 that large language models got good enough that we could actually use these things in production. And well, now you can do that inside of Perplexity's Deep Research and Pro Search. So this kind of collapses what was previously a multi-tool pipeline: first you would have to research, then export, then format. Now you can do it all in one. Right now, this is available to Perplexity Pro and Max subscribers, so you do have to be on a paid Perplexity plan, and enterprise users also get some additional pre-built computer workflows for website generation and audits. So here's why it's useful. Well, before, deep research was just a wall of text. Perplexity's deep research has gotten much better than what it was in late 2024 and early 2025, when I would tell people not to use Perplexity, the same thing with Grok, because the hallucinations were so bad. I think Grok is still struggling with hallucinations; I think Grok 4.2 is a little better with the multi-agent workflow.
But I think Perplexity is much better now in terms of, you know, not hallucinating sources like it had previously done, pretty prevalently. And now, instead of just staring at that big wall of text inside of deep research, you have the ability to create different artifacts. So who's going to find this valuable? I mean, a lot of people, but consultants, analysts, marketers, and project managers who regularly have to do a bunch of competitive market research and turn it into client-facing deliverables like decks, briefs, or competitive analyses, and just enterprise teams that are also using computer workflows. All right. AI moves too fast to follow, but you're expected to keep up. Otherwise, your career or company might lag behind while AI-native competitors leap ahead. But you don't have 10 hours a day to understand it all. That's what I do for you. After 700-plus episodes of Everyday AI, the most common question I get is, where do I start? That's why we created the Start Here series, an ongoing podcast series of more than a dozen episodes you can listen to in order. It covers the AI basics for beginners and sharpens the skills of AI champions pushing their companies forward. In the ongoing series, we explain complex trends in simple language that you can turn into action. There's three ways to jump in. Number one, scroll back to the first one, episode 691. Number two, tap the link in your show notes at any time for the Start Here series. Or you can just go to starthereseries.com, which also gives you free access to our inner circle community, where you can connect with other business leaders doing the same. The Start Here series will slow down the pace of AI so you can get ahead. Our next one. Speaking of Perplexity, Microsoft is borrowing a couple features from Perplexity and Claude. All right, the Claude one makes sense, since they're a big investor in Anthropic, but I'm kind of surprised here with some of the updates.
So here's what we got. Well, technically three, so let's go over the first two first. That is the new Microsoft Copilot researchers, Critique and Counsel. Critique uses a two-model pipeline where GPT from OpenAI drafts a research report and then Claude reviews it for accuracy, completeness, and citation quality before delivery. Yes. Yes, yes, yes, y'all. Microsoft, I'm just saying, they've been cooking. All right, I'm going to get to that more here in a minute. Counsel, though, runs both an OpenAI and an Anthropic model simultaneously on the same query and produces two independent reports; then a judge model highlights where they agree, where they diverge, and what each uniquely contributes. So these are both kind of Perplexity features that we've had. There's other platforms that have had this, but Perplexity was, quote unquote, one of the bigger names. So this is pretty big right now for Microsoft users. Here's who has access: both of these research agents are available now through the Microsoft 365 Copilot Frontier program for anyone with a Copilot license. I don't know the exact number of individual Copilot licenses on the Frontier program, but I would assume it is thousands of enterprises across the world, so I'm guessing tens of thousands or maybe hundreds of thousands of people now have access to the Counsel and Critique researchers. Also, it is important to know that tenant admins control whether third-party models like Claude are enabled, and Critique runs automatically when Auto is selected in the model picker when you are using that agent. So here's why it's useful. Well, before this, Researcher inside of Copilot used a single model end to end with no built-in quality check, so hallucinations and citation gaps fell on the user. And I think more than anything, this is just going to be a huge time saver, right? Because this is something I do manually all the time.
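As a rough sketch, the two patterns just described, Critique's draft-then-review and Counsel's two-reports-plus-judge, could be wired together like this. Every function here is a stub standing in for real OpenAI and Anthropic API calls; the names are hypothetical illustrations of the pattern, not Microsoft's implementation.

```python
# Stub sketch of the Critique and Counsel dual-model patterns.
# The model calls below are placeholders, NOT real API signatures.

def draft_with_gpt(query: str) -> str:
    return f"draft report for: {query}"            # stands in for an OpenAI call

def review_with_claude(report: str) -> str:
    return f"reviewed[{report}]"                   # stands in for an Anthropic call

def critique(query: str) -> str:
    """Critique pattern: one model drafts, a second model reviews before delivery."""
    return review_with_claude(draft_with_gpt(query))

def counsel(query: str) -> dict:
    """Counsel pattern: two models answer independently, then a judge compares."""
    report_a = draft_with_gpt(query)
    report_b = f"claude report for: {query}"       # stands in for an Anthropic call
    judgment = f"compare({report_a} | {report_b})" # stands in for a judge model
    return {"a": report_a, "b": report_b, "judge": judgment}
```

The structural difference is visible in the data flow: Critique is a sequential pipeline (the second model sees the first model's output), while Counsel runs the two models independently and only the judge sees both.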
So Critique also adds a new dedicated review pass that they say catches errors before delivery. There is a new kind of Draco deep research benchmark, and Microsoft's new agents took home the top score over Perplexity, OpenAI, and Anthropic. So who's going to find this useful? Just about any knowledge worker, I'd say. If you're doing complex research where accuracy matters, you're going to love this. All right, another one from Microsoft. This is Copilot co-work. Technically not new, but all it was before was a very small beta; now, like Critique and Counsel, this is rolled out to the Frontier early access program. So again, presumably, thousands of organizations are now going to have access to the Copilot version of co-work. Yes, this is essentially, well, I wouldn't call it a white-labeled version of Anthropic's Claude co-work, but it is powered by the Claude co-work technology; again, Microsoft is a big investor in Anthropic. So this is kind of their version of co-work powered by Anthropic's models. It handles long-running, multi-step tasks autonomously inside Microsoft 365. Essentially, users can describe the desired outcome; the system then creates a plan, reasons across tools and files, and executes step by step while the user monitors and can intervene at any point. Right now it's available exclusively through the Microsoft 365 Copilot Frontier early access program, and like Critique and Counsel, it does require a Copilot license plus that Frontier enrollment. Here's why it's useful. Well, Copilot before did require that step-by-step prompting inside individual apps, but co-work runs the entire workflow autonomously in the background. So it kind of shifts Copilot from assisting you at each step to just doing the job while you go do something else. Who's going to find this useful? Anyone, right? Your everyday knowledge worker.
If you are working a lot inside of Microsoft 365, you know, working across PowerPoint, Excel, Word, doing research, right, this is going to be for you. So like I said, it is built on technology from Anthropic's standalone Claude co-work product, but Microsoft's version runs in the cloud rather than locally on the user's machine. That is the big difference. So this isn't going to control your local machine like Claude co-work, but it will work with all of your files in the 365 cloud. And here's why I'm saying, speaking of working autonomously, here's why I'm saying Microsoft is straight cooking, y'all. Their Tasks feature, which I covered two weeks ago, is pretty amazing; I was actually blown away. That's very much like Anthropic's dispatch technology, where you can text Copilot and it will autonomously do things and schedule tasks for you. That's really good. You have the new Critique and Counsel, which at least on paper and in my demos look amazing. And then you have Copilot co-work, which we know is going to be a hit. And then we also saw news that Microsoft has elevated a senior leader to start working on their version of open claw. So that's not available yet, but they are at least publicly acknowledging that they are going to have their version of open claw. That's breaking-ish news. That's big. Once that does become available, we'll talk about it a lot more. When it does become official; all we know right now is they've hired and promoted someone internally to work on that. All right. And last but not least, we're going to see a ton more AI video. That's because Google announced a new update to their Veo platform, and this is Veo 3.1 Light. All right. So we already had Veo 3.1; now we have Veo 3.1 Light, Google's new most cost-effective AI video generation model. And here's why I think we're going to be seeing a ton more AI video: because on the API side, this costs less than half of Veo 3.1.
It is fast, with the same generation speed, and it has a lot of the same features, right? It supports text-to-video and image-to-video, in landscape and portrait. So who has access? Well, it's available now on the Gemini API and Google AI Studio via a paid tier, and right now this is developer-facing and not yet available in the consumer Gemini app. Also, Veo 3.1 Fast, which is different, not 3.1 Light but 3.1 Fast, Google said is getting a price cut next week. So here's why it's useful. Well, before this, per-clip video generation would get kind of costly for production-scale use cases. Now Veo 3.1 Light cuts that roughly in half, and this completes kind of the now three-tier Veo 3.1 family: you have Light for cost or volume, you have Fast for balance, and then you have the flagship, the full 3.1, for max quality. So this gives developers a clear cost-quality tradeoff. So if you're not sure or kind of confused how this works on the back end, well, good chance that many of the AI video platforms you see out there, where you're like, wait, what's this powered by, right, or if you're using a creative platform and it starts to offer video, well, good chance that it's probably built with Veo 3.1 Light, and they're just using Google's API. And that's what your subscription or your credits are paying for, right? A lot of times, companies just upcharge you on the credits for using it in their platform, right? A lot of these platforms have, you know, five, six, ten different AI video generators. Well, I'm guessing that Veo 3.1 Light is going to be a favorite in some of those platforms. So who's going to find this valuable? Well, anyone, but I think especially developers that are building video-heavy apps at scale, right? E-commerce product previews, localized ad variants, social media content automation, rapid prototyping, or just any team that was priced out of, you know, AI video generation at scale before.
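To make the cost-quality tradeoff concrete, here is a small back-of-the-envelope sketch. The per-second prices below are made-up placeholders, not Google's real rates; the only facts taken from the episode are that Light costs less than half of the flagship on the API side and that clips come in 4, 6, or 8 second lengths with cost scaling by duration.

```python
# Hypothetical per-clip cost model for the Veo 3.1 family.
# Prices are placeholders for illustration, NOT Google's actual pricing.

PRICE_PER_SECOND = {
    "veo-3.1": 0.40,        # flagship (placeholder rate)
    "veo-3.1-light": 0.15,  # less than half the flagship, per the episode
}

def clip_cost(model: str, seconds: int) -> float:
    """Cost of one clip; durations are limited to 4, 6, or 8 seconds."""
    if seconds not in (4, 6, 8):
        raise ValueError("clips are 4, 6, or 8 seconds")
    return PRICE_PER_SECOND[model] * seconds
```

Under any such pricing, the "Light for volume" argument is just multiplication: at production scale (thousands of clips), a greater-than-2x per-second discount compounds directly into the total bill.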
Right now, though, this doesn't support 4K output or the extension feature; those are only available in the flagship Veo 3.1. And right now, clip durations are customizable at four, six, or eight seconds, with costs adjusted accordingly. Also, I don't know if this is coincidental or not, but this did launch one week after OpenAI shut down Sora, which was reportedly burning like $15 million a day while reportedly only generating like $2 million in lifetime revenue. So, big AI features. I hope you feel caught up now. Let me know: is this show hot or not? If you're listening on LinkedIn, let me know; if you do want it to keep going, say hot. If you're listening on Spotify, same thing. I think this is the fourth one. I did say I wanted to do this at least five times to see if you guys are liking it, seeing if, you know, this is a highly downloaded show, if people are listening all the way through, all those metrics that Codex is telling me I should pay attention to, right? So if you do like it, let me know, drop a hot or a not, but then also go to youreverydayai.com and sign up for the free daily newsletter. Thanks for tuning in. Hope to see you next week and every day after that for more everyday AI. Thanks, y'all. Go break some barriers, and we'll see you next time.