ACCESS

The future of AI might be on your finger

75 min
Apr 2, 2026
Summary

ACCESS hosts Mina Thami, co-founder of Sandbar, to discuss the company's AI-powered smart ring that captures thoughts and responds in the user's own voice. The episode explores how wearable AI interfaces could become the primary way humans interact with AI, while also covering OpenAI's strategic shift toward coding productivity tools to compete with Anthropic's Claude.

Insights
  • Wearable AI interfaces represent a fundamental shift in human-computer interaction, moving from text-based inputs to conversational, always-accessible computing on the body
  • OpenAI is deprioritizing creative tools like Sora to focus on coding productivity as a competitive response to Claude's market traction and revenue potential
  • The success of new wearable form factors depends less on hardware innovation and more on thoughtful software design, personality, and user education around novel interaction patterns
  • AI tools are enabling founders and operators to move faster on decision-making and creation across all company functions, not just engineering
  • The future of AI assistants will be orchestration layers that integrate with existing tools and APIs rather than standalone applications
Trends
  • Wearable AI interfaces gaining traction as alternative input method to phones and earbuds for hands-free, private AI interaction
  • Coding productivity becoming the primary battleground for AI assistant market share and revenue, not creative or general-purpose use cases
  • Hardware startups experiencing resurgence after years of dormancy, driven by new AI capabilities and consumer demand for novel interfaces
  • AI-powered agent orchestration becoming core product strategy for major AI labs, enabling complex multi-step task execution
  • Founder-led product design philosophy emphasizing clear values and beliefs as decision-making framework in competitive AI hardware market
  • Neural interface research transitioning from medical/accessibility applications toward consumer wearables for thought capture and expression
  • MCP (Model Context Protocol) emerging as standard for connecting AI assistants to external tools and data sources
  • Consumer expectations shifting from AI as separate tool/assistant toward AI as transparent extension of human cognition and capability
  • Hardware companies building complete software stacks in-house rather than relying on OS-level APIs, driven by unique interaction requirements
  • Bluetooth and wireless connectivity remaining significant technical constraint for always-on wearable AI devices despite protocol maturity
Topics
  • AI Wearable Interfaces and Form Factors
  • Voice as Primary AI Input Method
  • Neural Interfaces and Brain-Computer Interaction
  • OpenAI vs Anthropic Competitive Dynamics
  • AI Coding Assistants and Developer Productivity
  • Hardware Product Design Philosophy
  • AI Memory and Personalization in Assistants
  • Wearable Haptic Feedback Design
  • AI Agent Orchestration and Tool Integration
  • Consumer Privacy in Always-On Wearables
  • Bluetooth and Wireless Connectivity Challenges
  • Founder-Led Product Strategy in AI
  • AI as Extension vs Assistant Paradigm
  • Hardware Startup Fundraising and Team Building
  • User Education for Novel Interaction Patterns
Companies
Sandbar
AI wearable startup building smart ring that captures thoughts and responds in user's own voice; featured guest company
OpenAI
Shelving Sora to focus on Codex coding assistant as response to Claude's market traction; strategic pivot toward product...
Anthropic
Claude and Claude Code gaining significant market share in AI coding productivity, forcing OpenAI to refocus product ...
Meta
Acquired Dreamer, an AI agent platform; developing Ray-Ban smart glasses with neural band for wearable computing
Apple
Referenced for Vision Pro launch strategy, hardware design philosophy under Steve Jobs, and wearable product ecosystem
Control Labs
Neural interface startup acquired by Meta; developed peripheral neural interface technology that became Meta's neural...
Fitbit
Health wearable company; Sandbar's head of hardware previously led hardware at Fitbit
Peloton
Connected fitness company; Sandbar's head of hardware previously led hardware at Peloton
Kernel
Brain implant company where Mina Thami briefly interned on neural interface research
MIT Media Lab
Research institution where Mina Thami conducted neural interface research in undergrad
Oura
Health-focused smart ring company; potential competitor to Sandbar though focused on different use cases
Dreamer
AI agent platform for non-technical users; acquired by Meta in private beta after a month of operation
Replit
Code execution platform connected via MCP to Claude and other AI assistants for agent orchestration
Figma
Design tool opened via MCP to Claude for AI-assisted design workflows
Shopify
E-commerce platform; sponsor of the episode
People
Mina Thami
Building AI-powered smart ring with voice interface and personal memory; previously worked on neural interfaces at Me...
Alex Heath
Co-host of ACCESS podcast; visited OpenAI and interviewed Peter Steinberger about Codex product strategy
Ellis Hamburger
Co-host of ACCESS podcast; discussed AI's impact on his writing and reporting workflow
Greg Brockman
Leading OpenAI's Codex super app initiative to compete with Claude; hosted dinner with Alex Heath at original OpenAI ...
Peter Steinberger
Recently joined OpenAI; discussed first month working on Codex product and competitive response to Anthropic
Fiji
Leading Codex super app product strategy; discussed Sora sunset and pivot toward coding productivity
Brian Johnson
Founder of Kernel, the brain implant company where Mina Thami interned on neural interface research
Steve Jobs
Referenced for product design philosophy and high bar for shipping consumer hardware products
Sam
Discussed MCP integration with Replit and Granola in context of AI agent orchestration
Eugenia
Described mini apps as containers for prompts and user interfaces for AI workflows
David
Was scheduled to appear on ACCESS podcast but deal with Meta was announced before episode aired
Josh Miller
Referenced as example of founder having insights during vacation that led to product 2.0 strategy
Evan Spiegel
Referenced as example of founder having insights during vacation that led to product strategy evolution
Quotes
"The future of AI might literally be on your finger"
Alex HeathOpening segment
"I wanted to build something that would be valuable when we reached general intelligence. To me, that looked like meeting people where they are, even after general intelligence, we'll be taking walks, we'll be thinking, we'll be remembering, we'll have things that we need to do."
Mina ThamiMid-episode
"If you're serious about this being an extension of you, it should be your own voice"
Mina Thami (quoting feedback)Product design discussion
"We're not going to be thinking of these chatbots as kind of the basic, you know, put some text in, get some text back, or maybe some visual chatbots. We're going to be thinking of them in six to nine months as like agent orchestrators."
Alex HeathOpenAI strategy discussion
"All thoughts are beautiful. Whether it's a grocery list, whether it's a heartfelt letter, whether it's some important interview you're prepping for, the fact that you are having those thoughts, I think, is what makes it beautiful."
Mina ThamiPhilosophy discussion
Full Transcript
Can you tell me about the Access podcast? Yeah, Access is a tech industry podcast hosted by Alex Heath and Ellis Hamburger. They interview big players and emerging founders, kind of the inside baseball of Silicon Valley. Okay, this product is dope. This week on the show, we're sitting down with Mina Thami, the co-founder of Sandbar. It's a startup building what might be the most interesting AI wearable you've never heard of, a smart ring that doesn't just capture your thoughts but talks back to you in your own voice. We get a live demo of the hardware. We dig into Mina's background in neural interfaces at MIT and Meta. And then we talk about why the future of AI might literally be on your finger. Plus, Ellis and I get into what's happening inside OpenAI right now and why the race between ChatGPT and Claude is about to get a lot more interesting. Welcome to Access. Welcome to Access. I am Ellis Hamburger here with my co-host Alex Heath. What's going on, my friend? I'm good, man. You are actually in Mexico right now. Technically. Back to the future. Using a rift in the space-time continuum. By the time this episode comes out, you will have just come back from hopefully a very relaxing beach vacation. 
I sure hope so. Because I think ever since I started the company a few years ago, and going through this renovation that went a lot over budget, et cetera, et cetera, kids, I haven't really had an actual vacation in, like, years, aside from family commitments. So yeah, I'm hopeful that this Cabo resort will literally do everything for me and everything for us. We're going to have some recreational drugs. What are we going to be doing down there? Recreational Miami Vice cocktails. Yes. I mean, do you unplug when you're actually on vacation? It's hard, man. I've found, especially since starting a company, it takes me almost two days to fully let myself relax. So when you do the short weekend or long weekend getaway, which I've done in Cabo several times, I'm usually relaxed by, you know, the last half of the day before I leave. Where is the phone located is the question. Because I feel like I went on this nature retreat during college for like seven weeks. That was just the absolute best. And we had no technology or electricity, and it probably took like four or five days to stop feeling those phantom buzzes on your leg from the phone vibration, and even longer to stop reaching for my phone every time I had a question that I wanted to Google or something like that. You just had to be like, oh, well, I guess I'll show you the song in eight weeks. Yeah. The phone is definitely near me, usually, because I catch up on podcasts when I'm on vacation. I like to listen in a more, like, active way where I'm actually thinking, and then it becomes work because I'm thinking about, oh, I like this idea. I like the way they did this. We should do this. I think I was literally texting you, like, pod ideas, probably stoned at the pool in Cabo last time I was there. But that's what all good founders say. I mean, this is the same with Josh Miller at the Browser Company or Evan Spiegel at Snapchat. 
They go on vacation and then they think of what they want to do for their 2.0. Or this or that. And they're like, why don't I do this more often? And it's like, yeah, I mean, being away from everything gives you the free time to actually let your subconscious do its thing. And I mean, I guess that's why you want the Stream ring so bad. Exactly. We've got Mina, the founder of Sandbar and this very cool ring, which I think we've talked about on this show before. Maybe we have. It's getting ready to come out soon. And we both know Mina, have hung out with him. He came and showed me the original prototype — or not the original, but one of the earlier prototypes — actually right by your house in LA, at a cafe, about a year ago. And it's cool to see the progression. They've raised a lot of money. They've got a great team and a really unique, interesting take on AI on your finger. I don't know. He's a very thoughtful dude. Yeah. Yeah. I love a good design founder. Clearly one of those founders who's just truly thinking deeply about the impact of their product on society. And I mean, I meet with founders every single day and, you know, a lot of them are just kind of here for fun. Want to build stuff. But yeah, they seem to have the full package across hardware. He talks on the show about how they've got folks from Apple, from Fitbit, from Peloton. And then on the software side as well. It's very cool to watch. I mean, it's been an absolute joy as a gadget nerd to see kind of the rise again of hardware startups, which I feel like we didn't see for quite a while. But yeah, they're back in full force. There's a lot of noise out there, but Mina's a pretty quiet dude and they've not made huge splashes, but I'm actually very bullish on this form factor. As we talk about, I've been thinking a lot about it, and how I want it, since I first tried it. 
But yeah, man, before we get to that, I guess the only thing I've got is, by the time this comes out, it'll have been about a week since this, but yesterday I spent a chunk of the day at OpenAI with a lot of their leadership, especially on the Codex — their, you know, Claude Code competitor — side, and then went to a pretty cool dinner with the Codex team and Greg Brockman, the co-founder of OpenAI, at their original office, in the library where they did a lot of earlier research work. Sushi dinner there in the library, which was cool. The dinner was off the record, but I had a bunch of interesting on-the-record conversations with them, including Peter Steinberger, the creator of OpenClaw, the claw father, who is about three weeks into being at OpenAI. So he told me kind of what he's working on and what he's thinking about in his first month. Is this guy your current crush in the tech industry if you had to choose? Yeah, Peter's an awesome dude. He's very genuine. He's just an old-school nerd who's in it for nerdy reasons. And it's very clear when you're talking to him. Did you touch his bicep? He's huge. He's jacked. Yeah, I felt scared to do that. But it was cool, man. It was his first interview since officially joining, and that's going to be out on Sources. And then also, by the time this comes out, my broader piece about what is really happening — which is that Codex is taking over that entire product suite. Right. Like they killed Sora, you know, the week before this episode comes out. And that's part of them focusing all their efforts, and GPUs most importantly, on building this super app to counter what Anthropic has done with Claude and Claude Cowork, which has really abstracted away coding for normies, where you are coding — you're using AI to execute code and crawl databases and all this stuff — but you don't need to know that. You can look at it, but you don't have to. And that's a very powerful thing. 
I mean, I use it every day, all the time. I know you use it. And what ChatGPT is going to become is that, essentially, and I think that's actually the future of all these assistants. A probably six-ish-month timeframe is when this is going to really start catching on beyond the, you know, early adopter crowd that's listening to this show. But it's going to be really interesting when you start building these coding harnesses in with the consumer chatbot experience. And that is very much underway at OpenAI as a response to what Anthropic is doing. Well, I mean, you know the personalities involved here. I mean, is this wanting to get back on the hype cycle? Is this seeing the financial opportunity with Claude integration at companies, and how much bigger that is than something like Sora? Or is this an actual change of heart? Because, I mean, as we talk about with Mina, you know, your products and your product strategy represent your beliefs about what you're trying to do. And I mean, when we talked with Fiji, whether it was about Sora and creativity, or Pulse and personal utility and getting these different, you know, news sources fed to you every day. I mean, has there been a change of heart there about what the company is about? I think they regret not being as good at coding as early as they could have been. Anthropic focusing on coding has really given it this takeoff moment that it's in right now, and the UI of Claude Cowork is making it easy for people. I think they wish they'd gotten to that first. I don't think it's a change of heart. I think it's seeing, oh, this is taking off. The models are also now capable of doing agentic things reliably, or at least most of the time reliably. And so the moment is now. And I think they're looking at the growth of Claude Code and going, you know, yikes, we need to combat that. 
And I mean, these companies do believe that coding is a way to get to AGI, which is definitely not a pivot. They've been trying to get to that. But I do think OpenAI is realizing that the money and the gains and the hype and the good vibes are in this coding productivity world right now, and not everything else — not adult mode, not Sora, which they told me exclusively, right, that they're shelving — kind of getting rid of what they're calling side quests to focus on this super app, which Brockman, the co-founder, is kind of leading with Fiji. And it means that Codex is going to be the harness, literally, but also just kind of enveloping ChatGPT in the same way that Claude Code is taking over Claude. Yeah. I mean, we'll see. You know, I think if you're following the trends and the hype and the crowd, I think that is arguably how you end up in the kind of situation that Meta's in, where they go on these shopping sprees every two years and buy up all the latest trends or whatever. I don't think it's that, because everybody was saying that Sora was going to be the next big thing and everybody's going to be making AI videos, like, with full conviction. And then it just kind of didn't happen. I mean, Sora — Fiji talked to us about that. Like, it was an early thing and it wasn't growing the way that they maybe had hoped it would continue to grow after it had that initial pop and was at the top of the App Store. Meanwhile, people are using the shit out of these AI coding tools and Claude Cowork, right? And, like, you can do some of that stuff in Codex, but there's an awareness problem where people don't know that Codex is capable of Claude Cowork's abilities. And I think they've got to build that into the chat interface, which is, you know, reaching almost a billion people, which is a huge move. 
And so basically, I just think, big picture, we're not going to be thinking of these chatbots as, kind of, the basic, you know, put some text in, get some text back, or maybe some visual chatbots. We're going to be thinking of them in six to nine months as, like, agent orchestrators. And that's going to introduce this capability to a lot more people very quickly. Yeah. And where do you feel like it's going to net out, that kind of distinction between the actual apps that you might need? Like, whether it's with that company Dreamer that was acquired by Meta, or something like Wabi, or even what some of the others like Replit are creating with your own applications. Like, do you think those are going to be a part of this same linear, you know, model, or are they going to be separate? Because it's a lot of the same features, right? Some of them just have interfaces and some don't, in terms of the orchestration. I think a lot of the vibe coding tools, Replit or whatever, are already connected via MCP to, you know, the big assistants. I think the assistants — when I say that, I mean ChatGPT and Claude, mainly, but Gemini as well — are going to be these orchestration layers that have all your context, your memory, hooks into your data and your tools, which are Figma, which has really opened up via MCP to Claude, or, you know, Replit, which — you know, I was seeing with the Granola MCP, you can literally just have, you know, we were talking about that last week with Sam, right? You can have Replit spin up something, you know, via Claude as your Granola notes come in or whatever. Like, I think we're going to start thinking about apps really differently in the next few months. And I think it's going to happen faster than people think, based on what the labs are seeing. 
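The "agent orchestrator" idea being described here — an assistant that holds your context and routes requests out to external tools like Figma, Replit, or Granola — can be sketched in miniature. This is an illustrative toy, not the real MCP wire protocol or any vendor's SDK; the tool names, registry, and request shape are all hypothetical stand-ins.

```python
# Toy sketch of an agent orchestrator: a registry of callable tools,
# plus a dispatcher that routes one structured tool call (of the kind
# a model might emit) to the right function. Real systems such as MCP
# add schemas, transport, and auth; everything named here is made up.

def search_notes(query: str) -> str:
    """Stand-in for a notes/memory tool (e.g. a Granola-style store)."""
    return f"notes matching {query!r}"

def run_code(snippet: str) -> str:
    """Stand-in for a code-execution tool (e.g. a Replit-style sandbox)."""
    return f"executed: {snippet}"

# The orchestration layer's view of its available tools.
TOOLS = {"search_notes": search_notes, "run_code": run_code}

def orchestrate(request: dict) -> str:
    """Dispatch one structured tool call emitted by the assistant."""
    tool = TOOLS.get(request["tool"])
    if tool is None:
        raise ValueError(f"unknown tool: {request['tool']}")
    return tool(**request["args"])

print(orchestrate({"tool": "search_notes", "args": {"query": "sushi dinner"}}))
# → notes matching 'sushi dinner'
```

The point of the sketch is the shape, not the plumbing: the assistant owns the registry and the user's context, while each tool stays a thin, swappable adapter — which is roughly why the hosts expect apps to blur into "orchestration layers" rather than standalone destinations.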
I mean, it's just interesting and kind of fuzzy, because, I mean, Eugenia from Wabi has, you know, described a lot of these mini apps as kind of like containers for prompts, for things that just often are better as a user interface. Like, for example, a photo gallery of AI generations is great as a little app or user interface. And I think if you look at Dreamer, as far as I could tell, under the hood it's a lot of connections being made between all these different services that all themselves have APIs and MCPs. And that's something that you could theoretically just ask Claude in a year to do. But I don't know — I mean, how important were the interfaces for those things? Or was it more just about templated orchestrations? Yeah, Dreamer is a really cool platform, sold to Meta within a month of being in private beta. I got added about a week and a half before it sold. A little inside baseball for how quick this Meta deal came together: the CEO was asking to be on this pod like a few days before the deal was announced. So I don't think they knew this was coming. It's a great team. A lot of early Android, Stripe, Meta people, and they were building agents for normies. They were building a marketplace as well, where you could remix other people's agents and get rev share from how they're used, which I thought was really cool. And it was an agent, OpenClaw-ish type thing that, like, you know, your parent couldn't break, which I thought was cool. And they had all these interesting data sources that I haven't seen in other chatbots: live game data from, you know, sports leagues, stuff like that. It was a really unique approach. And the interface was really simple. And clearly Meta thinks that's something they need to kind of build into their platform, whether it's for the businesses that advertise or end users. So yeah, man, it would have been cool to see Dreamer keep developing and, you know, have David, the CEO, on. 
But they went and got those Zuck bucks. Good for them. That's funny. That happened to me as well. A million years ago, I was all set — I forget if it was an exclusive or what — to do a big interview with the creators of Vine, when I was at The Verge, and then they completely ghosted me, which, like, pretty much never, ever happened, and which I'm maybe even still butthurt about. And I was like, where did they go? This was really cool. You know, and then a few weeks later, uh, yeah, the announcement that Twitter acquired them, and I was like, oh, that's why. Yeah, this happens every now and then, where a founder goes silent. Yeah. And you're like, that's why. Yeah. Uh, yeah. That's all that's on my mind right now. Uh, I hope you are enjoying your vacation this week. Uh, I'm going to manifest it from today's recording to where I will be. Yes. You're going to be so relaxed when you come back. And, uh, yeah, man, I guess with that, we should probably kick it to Mina. Cool. Ready to launch your business? Get started with the commerce platform made for entrepreneurs. Shopify is specially designed to help you start, run and grow your business, with easy customizable themes that let you build your brand, marketing tools that get your products out there, integrated shipping solutions that actually save you time. From startups to scale-ups, online, in person and on the go, Shopify is made for entrepreneurs like you. Sign up for your $1 a month trial at Shopify.com slash setup. Hi, I'm Brené Brown. And I'm Adam Grant. And we're here to invite you to The Curiosity Shop, a podcast that's a place for listening, wondering, thinking, feeling and questioning. It's going to be fun. We rarely agree, but we almost never disagree. And we're always learning. That's true. You can subscribe to The Curiosity Shop on YouTube or follow in your favorite podcast app to automatically receive new episodes every Thursday. Mina, what's going on, boss? Good to see you. How are you? Good to see you. 
Is this your headquarters you're coming to us from? It is indeed. Sunny Flatiron. You got some nice lighting in there, some skylight. We were in a basement for about a year. And so our single requirement for a new office was as much sunlight as humanly possible. How does it feel to be a certified ring guy now? You're one of those designing the future of rings. Everybody's got a take. I wore rings a bit actually before Sandbar, but I have effectively stopped wearing anything except for our prototype by this point. I think I may have just gotten tired of wearing things. Well, I find it really interesting — whenever I meet someone, I try to ask why they're wearing the jewelry that they're wearing. And there's usually very deep and personal reasons: whether it's a bracelet, a necklace, a ring, it'll have been a gift from a loved one, or it'll be a reminder for a period of life, or it'll be some strange journey they went on where they wandered into a place and picked something off a shelf. It's very rare that I meet someone and there isn't some very deep reason for what they're wearing. So this is your version of the TikTokers who go up and go, you look so confident. But you go up to them and say, why are you wearing that necklace? That, and ideally no camera. It's certainly not a tough sell to us ring bros in LA. Everybody's got a signet ring or some type of woo-woo spiritual jewelry. I'm naked. No bling. I mean, I got a wedding ring, you know, but, you know, I'm not taking that thing off, but I'm going naked otherwise on the fingers. Hey, that's a very meaningful ring to be wearing until I get your ring, which is soon, right? Yes. Yes, we'll be shipping in the summer. And we're actually going to be racing towards beta sooner than that. We're aiming to begin beta in April, where we'll begin distributing more prototypes to more folks to try out. Yes. So, like, walk us through: what's it been like to actually wear it every day? 
How long have you been wearing it every day now? Maybe around two years. It depends — we've had a bunch of generations. So I brought with me some of the older prototypes. This was the first one. So you can actually see — for our audio listeners, he looks like he's holding something that the optometrist would put in front of your eyes, maybe to test your eyesight. Poetry. Or I would say like an ankle bracelet when you're on house arrest or something like that, but for your finger. Very stylish. Yeah, I was wearing it a lot on the High Line on the west side of Manhattan, where people will go out in their full regalia. And you know, this was meant to be a bit of a fashion statement. But when we had this prototype and we were trying things out, the use cases looked very different. We had it hooked up to search APIs. We had it hooked up to navigation. And we were mostly using it as kind of an urban exploration product. And it was only when we began exploring conversation and notes in the moment that we felt the use case begin to click together. So after a few more iterations, we started to give this out, actually, in what we called alpha, to folks to test. This version also had a gap on the bottom, and eventually we closed it and got to this version. And that's where we saw people beginning to use it in different ways. Some people use it as a personal CRM, where, like, after meetings, they'd say a note about someone. No CRMs on the pod. Banned. That's how I'm going to use it. Let's not say CRM. Let's say the network of people you care about in your life. Okay, fair. Other people would use it while they were driving; in one case, a professor would use it to prep for a class that she was going to be talking through. So a lot of the magic of Stream is conversation. It doesn't just capture what you say; it remembers, and will actually talk back to you in your own voice. So you can use it as a thinking partner. 
And that's where we saw use cases like, you know, planning a lesson plan, prepping for a meeting, talking about an essay you were writing, begin to really blossom. And recently we've been experimenting even further, which is, you know, now LLMs don't just respond, they can actually act. And so it would mean that wherever you were, you could just fire off an action and have something start going. So that's a lot of what we've been playing around with now. Oh, I didn't even think about that. Since you started building this, especially in the last, like, six months — yeah, the agent stuff has really taken off. Has that just changed the roadmap for the app? I would say it's opened it up quite a lot. But also, like, so many temptations, right? Every founder just wants to do trip planning for their gadget or their AI, right? Now you technically can. That's true. I mean, I'd say we really try to just start with prototypes we try on ourselves and people close to us and see if it actually is useful. And we're starting with some hypotheses that we hope are generally useful, regardless of what form agents or other technologies take. When we were starting, we believed that there was no interface to make use of LLMs in the moment, and eventually felt it would require some mix of voice, of personal privacy, of immediacy, of control over the cadence of the conversation, as well as a lot of work on model-centric things like latency, memory, personality, tool calling and routing. And that's kind of agnostic to what you're trying to do with the AI. You need a personality that can present information concisely and helpfully. You need memory that can maintain context over multiple conversations. And now with agents, we see that growing into a new set of primitives that can be layered on, that can allow you to talk through and take actions on basically whatever thing you want to use an agent for. I saw you tweeted, "New hardware plus software will finally unlock telepathy." 
Maybe this was a while ago, but it sounds like that's the vision here. That's the dream. Most of my background was in neural interfaces. It was a bit of a winding path to consumer hardware. And for those who don't know, explain neural interfaces because it is kind of as freaky as it sounds. For me, neural interfaces were always about expressing yourself fully. There are all kinds of thoughts that race through our minds, all kinds of emotions and experiences. We have tools to share our full experience with other people like creating art, things that can't easily be put into language. I thought neural interfaces could be beautiful because they could allow us to express even more completely than that. So in undergrad, I spent a lot of time doing research at the MIT Media Lab and actually briefly interned for Brian Johnson at Kernel when they were doing brain implants and then moved to New York to join a startup called Control Labs that was developing a peripheral neural interface for computer control. Through that, joined Meta and was working on a lot of deep learning models and wearables there that eventually led to Sandbar. Yeah. And for people who don't know about Control Labs, it was the foundation of the band, the neural band that is connected to Meta's new display glasses, which I have and honestly have been very impressed by at every phase of seeing that band develop over the years inside Meta and the haptics of it are what's also really incredible. It's not technically reading your mind, but it's interpreting impulses, pulses, right? How would you say it? You can say it better than me. It's interpreting pulses from your muscles in your arm, is that right? That's right. So when you move, your brain sends a signal down your nervous system and when that signal hits a muscle, the muscle amplifies the signal significantly and that creates an electromagnetic field. 
And then we had sensors in a wristband that could read that electromagnetic field, and we used deep learning models trained at really incredible scale to interpret those fields into inputs. So you could do something as subtle as writing with your fingertip on a table, and we could reconstruct what you had written as an easier way to write on the go. Ellis tried this in Vegas. Last time we saw you was during CES. We met up on the casino floor. I have a photo of Ellis wearing the glasses and using the band. Ellis, do you remember it? I do, but I mean, I'm just kind of getting a little spooky feeling right now. I'm reading the original Ghost in the Shell manga. Have you ever read it, Mina? I haven't read it, but I'm a huge fan of the movie. Yeah, the movie's incredible. And you know, there's just this whole idea that in the future, everybody kind of has a cyborg cyber brain, and a lot of people even have interchangeable bodies. And it seems like, is that not what you were literally building toward? Perhaps not in terms of ethos, but if we could read all those electrical signals, you could theoretically pass those along to electronic prosthetics. How many years out? I got to experience some of this really personally in prior work. The lab I was working in at the Media Lab was focused on neural prostheses, prosthetic limbs for individuals who had lost a limb and needed to regain mobility. Before that, I had briefly interned at the Applied Physics Lab, which was similarly doing neuroprosthetics, but for arms. I really think of this technology in terms of giving people freedom. There are a lot of challenging questions around what we want our relationship with technology to be. A core belief that I've developed through these experiences is that the best relationship comes when technology acts as if it is an extension of you, rather than a companion or an assistant or a replacement or even necessarily a tool.
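[Editor's note: to make the decoding idea above concrete, here is a minimal, hypothetical sketch of the kind of pipeline being described: multichannel surface-EMG samples are cut into overlapping windows and reduced to per-channel energy features that a classifier could map to inputs. The function names, shapes, and feature choice are illustrative assumptions, not CTRL-labs' actual system.]

```python
import numpy as np

def window(signal: np.ndarray, size: int, hop: int) -> np.ndarray:
    """Slice a (channels, samples) EMG stream into overlapping windows."""
    n = 1 + (signal.shape[1] - size) // hop
    return np.stack([signal[:, i * hop : i * hop + size] for i in range(n)])

def rms_features(windows: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel, a classic EMG feature."""
    return np.sqrt((windows ** 2).mean(axis=2))

# Fake data standing in for a 16-electrode wristband recording.
rng = np.random.default_rng(0)
emg = rng.normal(size=(16, 2000))              # 16 channels, 2000 samples
feats = rms_features(window(emg, size=200, hop=100))
print(feats.shape)                             # one feature vector per window
```

In a real system these feature vectors (or the raw windows) would feed a trained deep model that maps them to keystrokes or gestures; the windowing and featurization shown here is just the front of that pipeline.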
There is a little bit of uncanniness at the start of that, which is, well, what should my relationship to this self-like thing be? Is it me? Is it not me? I think that's where some of that question comes from, perhaps what you're describing, Ellis. When something has your context, acts with your goals, and is a literal extension of what you want to have happen, I think you eliminate a lot of the gap that comes with something that is other than you, which creates a tension where perhaps the goals are misaligned, where perhaps the context isn't quite right and you're having to over-explain yourself. To me, whether it be a ring or a neural interface, the goal really is about letting you express yourself very easily. That's come through very clearly in the product, I think. When you showed it to me, God, whenever that was, that was a while ago, in Los Feliz, and I loved that you brought the case with all the different rings. Then when we were talking about it in Vegas with Ellis, the way you've designed it, the intentionality of it: you have to hold it up, almost like you're a Secret Service member talking into their sleeve, and prompt it to engage with you. You have that haptic feedback where it vibrates and lets you know that it's listening, versus being always on or chiming in, which is even more assertive for an AI, and a lot of hardware products are going that direction. You're viewing the ring as something that the human prompts and fully controls. I think that having a strong belief or a North Star really helps with product design, whether software or hardware or interfaces. I'll often chat with other founders building software or hardware who are perhaps struggling to make decisions one way or another. I think if you can plant a flag and say, this is what I believe in, and allow that flag to dictate choices to you, it makes a lot of things much easier.
For example, putting haptics into the ring was not easy, but we knew that in order to give that feeling of something being an extension of you, we would need to tap into many more of the body's senses. Haptics are a great way of closing the loop that what you've said has been heard, and of potentially even silently prompting you, as reminders do, or perhaps in the future if some action being taken needs a clarification. Similarly, originally Stream's voice was just an off-the-shelf voice, but it didn't feel quite right. And I remember Kirak said to me, if you're serious about this being an extension of you, it should be your own voice. And I thought, oh, it's going to be so weird. And we tried it out. And for the first day, it was kind of uncanny. And then by the second day, everyone on the team just really loved that being their inner voice. And as we've continued to roll it out, we've been surprised at how many people seem to prefer it. I need to use it more, but that kind of freaks me out, I'm not going to lie. When you had it talk like me. Do you guys hear your voice in your head when you think or plan things? So I don't think I hear my voice. Do some people? Is that a thing? So I know some people can't visualize things in their head. I think it's maybe 30% or so, though I might be getting that stat wrong. And a similar percent perhaps don't hear a voice. And then some people don't see or hear in their head. Ultimately, self-expansion dictates a lot of things to us about our direction. One is that the default voice is an inner voice. But another core value of self-expansion is self-autonomy. You should get to make choices. And we as a company should dictate as few of those choices to you as possible. So you'll always be able to swap to another voice if you prefer that. As long as it's not my grandma's Waze voice, which is like the voice of a British butler who's like, please make a U-turn promptly.
And it's kind of gravelly. Just do better than that. That's not the pick for you? Not the pick for me. That doesn't help me do my best thinking. But speaking of, I mean, there's clearly such a naturalistic vibe to what you're building. And one of the things I often struggle with personally is when I'm walking outside or sitting outside, we all have the tendency to pick up our phone and see what's going on or check our notifications or whatever. But I've been trying to actively not use my phone while walking and just kind of take in the sights. How do you guys think about that as you make technology even more readily accessible? And I mean, obviously, it's not any more accessible necessarily than something like an Apple Watch. But it's clear that you want to maybe accelerate this idea that people are putting their ring up to their mouth and logging some thoughts while they're walking or being outside. I honestly love how in flow I get to be using Stream. A lot of the initial motivation to even work on this problem was that I walk everywhere in New York and I wanted a way to capture thoughts or talk through ideas without interrupting. And for me, a phone wasn't quite right, because you'd have to stop and pull it out. Earbuds weren't quite right, because there's really no easy way to invoke them. And it's kind of embarrassing to project out into the void. Right, you say, hey Siri, please take a note, and it's: you want to call your ex? Dialing now. Actually, that reminds me: we didn't plan it entirely top-down from the start, but a strong intuition we had early on was that the ring would need to be capable of both conversation and music control. Because if you were to do that interaction you just described, what you'd immediately then want to do is resume whatever music you were listening to. So the nice thing about Stream is I could, for example, say, can you tell me a little bit about the Access podcast? Sorry, I'm having a connection issue. And in true demo fashion.
In true demo fashion. We'll try that one more time. That was perfect comedic timing. You know it's a real demo when it bugs out. I wish it was in your voice though as well. It should be. Can you hear me? There we go. It's loud and clear. Are you doing okay? Can you tell me about the Access podcast? Yeah, Access is a tech industry podcast hosted by Alex Heath and Ellis Hamburger. They interview big players and emerging founders, kind of the inside baseball of Silicon Valley. And then if I wanted to resume whatever I was listening to, I could just tap again and it'll resume. That might not come through on the mic, but I can also swipe to control my volume. And that lets me just stay in flow if I had been walking or biking or driving and listening to a podcast or a song, like Access. Okay, this product is dope. This is too good to pass up. Can you ask it what you've been worried about lately? So this is actually a demo account, because I'm intentionally too embarrassed to show my actual Stream. It will pull up things from a synthetic Mina's memories. So if you zoom out, you can see things in the past. We can see what it thinks the synthetic Mina was worried about. Can you tell me some of what I've been thinking about and maybe worried about recently? Yeah, from how our talks have gone, a few threads run through things. You've been circling around creative authenticity, pressure to perform, needing quiet trust in your own pace, and wanting steadier ground beneath the work. Does that fit? Okay, so this is the ultimate performative male. If you've heard of the performative male memes on the subway: a bro narrating his personal inner monologue into his ring. Ask it to prep you for this, because you asked it about the podcast. Say, can you prep me to be on it? I'm about to go on the Access podcast. Some might say I'm already on it, but can you help me figure out how to prep and maybe even talk about you? That's a cool moment.
If you mean literally tap, just a single tap on the ring will wake or pause me. For talking about me, what angle are you thinking? Tech, design, philosophy, or relationship? I see. It's pretty steerable. It's not going to come at you with a huge long thing. It's going to ask you how to guide it. That's right. That's a lot of the work we do on its personality. We want it to present information very clearly, and we want it to ask questions that get you thinking. But otherwise, what we say is we want personality without identity. It shouldn't have opinions. It shouldn't have goals. Ultimately, down the line, it should be alterable by you if you would like to give it those things. But it turns out it's actually relatively hard to get a model to behave that way out of the box. So we've had to build a lot of infrastructure for creating, like in this account, synthetic users that Stream interacts with. And then we have judges that basically look at interactions with respect to the goals we have for the personality. Since the app has gotten more full-featured, are you using ChatGPT or Claude less? That's a good question. I've noticed on the go, effectively all of my usage has gone on to Stream, and I no longer use other apps, like AI apps or notes apps. On desktop, I continue to use them, definitely. And I think ultimately, our goal really isn't to compete with ChatGPT or Claude. We're trying to present a relatively flexible interface at the end of the day, and we're starting with some specific things that you can't really easily do on the go that hopefully we can offer to you more easily. But if you'd like to use this interface in other ways, as a direct input to those AIs or for other applications, that also comes from our self-expansion value dictating back to us.
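[Editor's note: the synthetic-user-plus-judge loop described above can be sketched very roughly. In this toy version the "judge" is a few stand-in heuristics scoring replies against personality goals like conciseness, asking guiding questions, and having no opinions of its own; in practice the judge would itself be an LLM call, and the rubric names here are invented for illustration, not Sandbar's actual eval system.]

```python
# Rubric: each named personality goal maps to a check over one reply.
RUBRIC = {
    "concise": lambda reply: len(reply.split()) <= 40,
    "asks_question": lambda reply: reply.rstrip().endswith("?"),
    "no_first_person_opinion": lambda reply: "I think" not in reply,
}

def judge(reply: str) -> dict:
    """Score one assistant reply against every rubric item."""
    return {name: check(reply) for name, check in RUBRIC.items()}

# Replies a synthetic user might have elicited during an eval run.
synthetic_turns = [
    "Got it. What angle are you thinking: tech, design, or philosophy?",
    "I think you should definitely lead with the hardware story because "
    "it is clearly the most interesting part of all of this.",
]

scores = [judge(turn) for turn in synthetic_turns]
print(scores[0])  # short, ends with a question, no opinion: all checks pass
print(scores[1])  # states an opinion and asks nothing: two checks fail
```

Aggregating these per-turn scores across many synthetic conversations is what lets a team tune prompts or fine-tunes against a personality target instead of eyeballing transcripts.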
So, for example, we recently shared a sign-up form for our private SDK alpha, where we're looking for developers who would like to try building on top of Stream, so we can basically develop that SDK together and let other people use this how they will. Could you do an MCP-type thing and just let it pipe into my Claude that way or whatever? Could I just connect the MCP? I think the short answer is yes, but a lot of things break really quickly. There are issues with context management, conversation management, latency, asynchrony, a few other things that MCP doesn't solve out of the box that we're trying to make ideally easier for people to plug and play with. I'm picturing in a couple years' time, Alex speed-walking through the airport, talking to his Stream ring: hey, mirror-world Alex, water the plants, walk the dogs, make dinner for my lovely wife, make no mistakes. I am going to use the shit out of this thing. I cannot tell you how often I am like, damn, I wish I had this ring on. Ever since you first showed it to me, Mina, it's been one of those things that consistently comes up for me in a way that most demos don't. I was just like, this is such a natural way to quickly remember something, jot a note, do a quick prompt. Until we have glasses or whatever, or I don't know if the necklace wearable thing will ever really take off, but until we have some easier input into AI that we can wear, there is just all this friction. It's not even about the AI piece, honestly. It's memory recall for me. There are so many times where I'm context switching or I'm walking out of a meeting or an interview and I want to walk the dogs and I want to remember something, and you just don't want to type it. Even with Wispr Flow lately, I've just been dictating more and that's been a help, but man, this is like... Wait, I thought you were using Monologue. Are you already on to the next thing? I switched to Wispr Flow.
Dude, you have to tell me when these things happen. All respect to Dan and the Every team, but yeah, I think Wispr Flow is my jam now. We, in many ways, I think lucked out with the timing of voice as an interface. So funny. I feel like people have been talking about it for 15 years. Finally, it's here. All things are converging. All roads lead to Rome. One thing that I would say I haven't seen as much of, though, is a distinction between voice and conversation. I would say we believe in voice, but we believe even more in conversation, because something we've observed is a lot of people see what we're building and say, oh, I'd love to use this to capture. I'd love to use this as a voice input device. But living with it, what you rapidly find is sometimes you need to recall something. Sometimes you need a little bit more information, especially if you're on the go. Sometimes that thing you're capturing isn't just a one-and-done thing. You're actually iterating on an idea. That's where having something be able to speak back to you becomes so important. You can, I think, see that in most UIs today. There are very few one-way interfaces. Claude Code will ask for clarifications as it's building. You'll need some way of delivering output back to the user. That's where a lot of the hardest feedback, ML, and infrastructure problems end up being. I mean, it's got to be so tempting, like I was saying, to just go the mega-agnostic utility route, but it's clear that your heart is in a very specific place with the type of life you want to afford people. I would also imagine that imminently, because you've been able to build it, the factories are right there, and there's going to be a flood of AI voice capture clones from China hitting Temu and Amazon and everything within a year or two in all likelihood. Does that make it easier to choose a narrower path, or do you want to win that battle too, you think? I would say we have a lot of ambition.
I think at the end of the day, we are trying to create a new interaction model that becomes the primary interaction model. We'll see whether we get there, but I think getting there depends on making things that are useful, that people like. I definitely watch other things that are happening, but none of us have proven that this is the right thing. I think that's still the obligation our team has: to not get distracted, to just make sure it's as useful and as beneficial as it can be. I have a lot of faith in our team's ability to do that at high quality, both on the hardware and the software side. On hardware alone, there is a lot that goes into making something that is very reliable and very safe. We have a pretty stacked team. Our head of hardware was the head of hardware for Fitbit and Peloton and made the Kindle. We have many engineers from Apple and from elsewhere who are oftentimes focused, for example, purely on the battery or purely on sensor fusion for the inputs to the ring. On the software side, we're similarly working to build effectively everything down to right above the foundation model from scratch. All of the infra and connectivity and iOS experience and evals and tuning. Some of those things I think will be easier to replicate than others. At the end of the day, all that matters is that it's a good product. Is that haptic thing patent pending? I would say we have a lot of IP in motion. Does anybody even do that anymore, Alex? Patents for hardware matter. I don't think they really matter for software, right? Enforcing patents, I mean. You want to spend 10 years in court against Samsung or whatever and spend all your VC money on that? Hell yeah, you do. We like to devote most of our time to building good products. We've also been working on this for a long time and have learned a lot in the process.
I know you've got this launch of the ring, obviously, that you've been working towards, but my understanding is you're thinking about other form factors. Are there other types of wearables that you think are going to actually work? I mean, everyone's trying rings, bracelets, necklaces, pendants, pucks, right? OpenAI is probably going to do some kind of puck-type thing. What do you think is the other compelling form factor out there? Glasses? Yeah, I think about it almost in terms of what parts of the I/O challenge you are solving or missing. When we started Sandbar, we were building a few things in parallel and trying them out. What we arrived at was a feeling of: we think the ring solves voice input. It's very fast. It's very private. It allows you to do a push-to-talk control interaction. But it doesn't solve other things. It doesn't solve certain audio contexts. It doesn't solve vision. Frankly, if I had a confident belief in what that thing was, I would say, oh yeah, and we're working on XYZ. We just haven't gotten there. We've tried a lot of things, but those things haven't crossed our bar. All I can say is I do believe that there will be other form factors necessary to solve other parts of the interaction. That's the main thing I miss about Apple under Steve Jobs: he had a bar. Right? And they test so many things internally, as you're doing, and you clearly have a bar for when you feel like it's ready. And if you do believe that you understand consumer behavior and you have better taste than anybody else, you ship it exactly when you know it's ready. You know, like how the iPhone was later than the Treo and the Pocket PC and all that. And to me, that's kind of the most disappointing aspect of the Vision Pro in hindsight: it showed that they decided to kind of go with what Wall Street wanted from them, even knowing full well, I would think internally, that most consumers are not ready for this quite yet. It's not light enough quite yet.
The software isn't there quite yet. And as soon as it's ready, they can come out with the world's best hardware for it. And yeah, that's my soapbox of the day. I go back and forth on that. I think you learn a lot from things out in the world. And so I'd probably take the other side: I think they may have released at the right time. They demonstrated a lot of new-to-world capabilities. They got to see a lot of use cases. They got to share their interpretation of what that future could look like. And it typically takes a few revisions for any product, but especially for a consumer hardware product, to become mass market. Isn't it the third one? It's always like the third one is the one that's supposed to hit. I don't remember which Quest model really got to mass market, but it was about the third one. And it was the same thing with the iPhone. It wasn't until the 3G that it really hockey-sticked. That's a pattern with consumer hardware, I think. I don't know about that. People freaking loved their OG iPhones and iPods and AirPods. No, I'm just talking about the companies that make the hardware feeling good about the product. That usually takes getting to gen three, is the thing you hear time and time again. Because you learn all the errors in one, you refine them in two, and then three is when you really get to build on that and ship the thing that you're proud of. Is that fair, Mina? I think that's true. But I guess maybe what I'm hearing from you, Ellis, is a distinction between a product becoming mature versus a product being right. Right. I mean, it seems that they were caught off guard by the response, and a lot of the commercials were fake. You saw this with Apple Intelligence as well, with the fake commercials that they pulled. They just weren't realistic in any way. And I mean, if you look at the OG iPhone, they thought it was good enough to be out there. They knew it didn't have 3G.
They knew it didn't have copy and paste; that was very clearly on the roadmap. And it felt like they had their ideas in order, a very strong conviction about what the future was going to look like. And I can see you guys in your office, with the Apple people and the Fitbit people, being like, well, these glasses are cool, but they're still too big, and so that's shelved until it's smaller, you know. And that's why I think the Ray-Ban Meta thing has been so successful. I mean, we had Alex from Meta, who you know, on the show. And he was like, the people at Ray-Ban were giving us super hard constraints that we wouldn't have otherwise had. And I do think that Apple used to have that type of bar for what might be considered, you know, culturally acceptable. That maybe isn't quite where it used to be. I don't know. I think one benefit that we had from the start was how unimportant we are. And also being in New York. Like, we were wearing this and way crazier prototypes just in public from the very start. And no one minded, partly because it's New York, partly because we're unimportant. And a lot of our bar is our own feeling using the prototype or the product. But a lot of our bar is watching and seeing how other people react. And frankly, what got us over the bar to saying we should ship the ring was how many people were telling us, like, sell me this. Take my money. Always a good sign for a startup. And so we were thinking, okay, that probably means it's directionally correct. In 1984, Apple launched maybe the most consequential computer ever. It was not a particularly good computer; there was actually a lot wrong with it. But the Macintosh had all of the right ideas about what computers would become. And it kind of changed everything. This week on Version History, our chat show about the best and worst and most interesting products in tech history, we're telling the story of the Macintosh. And why?
Again, despite not being very good, it managed to change everything anyway. That's Version History, on YouTube and wherever you get podcasts. This week is the 50th anniversary of Apple. And so this week on the Vergecast, we're taking stock of where Apple is five decades into its existence. How's the company doing? And we also decided to do something slightly ridiculous, which is identify and rank the 50 best Apple products of all time. After a lot of hours of debating, I think we finally got there. That's on the Vergecast this week, along with the state of OpenAI as it raises a ton of money, tries to go public, and tries to convince you that you also love AI. All that on the Vergecast, wherever you get podcasts. So where do you think wearables have gone right and gone wrong over the last several years? I think one thing that comes to mind, even though Apple has attempted to tamp it down, is just how much of a distraction notifications on the Apple Watch are. And I think a lot of people have started to push back on that and even stopped wearing them as a result. They're also just incredibly lame, Ellis. Yeah. What do you think about that, Mina? One thing I find really interesting is that we've effectively never seen input or interface wearables. You could maybe debate the Apple Watch, but it's primarily a health product, as are effectively all other successful wearables. Of course, there are earbuds and headphones that are really focused on music. But in terms of wearables that give you new computing experiences, we really haven't seen anything like that. I think the Meta display glasses and the neural band were perhaps the first step in that direction. And this is something we talk a lot about internally. We live with Stream every day, so we get to feel how intuitive and familiar it is. But even the fact that you can talk into a ring is so outside the realm of what most people have ever seen that it's going to take a while for people to get used to that new reality.
So I do think that any new wearable needs to spend a lot of time on storytelling, on user education, on making things really accessible. Otherwise, they just remain tech gadgets or toys for a very select few who are willing to jump through the hoops. We need an air horn sound for whenever a guest says storytelling, because that activates Ellis. Yeah, it is interesting though. It is such a positioning thing, isn't it? I think it's easy to forget that the Apple Watch was not initially positioned as a health product. And I feel like that actually may play into your narrative. It may have been gen three or around then that they started repositioning to health, and that clearly was the right approach. But I mean, I know they also, I don't know if anybody uses it, but doesn't Siri activate when you raise to speak? I don't know if it's by default or not, but... I don't know if you can control when Siri activates, but I think that's true. It's spontaneous. That does make me think about how, when you look at a wearable or you look at a hardware product, it's easy to miss how much software is required for that experience to make any sense or feel good. The parallel I would give is: if I handed you an iPhone 1 and it had none of the software required to drive multi-touch or do app switching and pinch-to-zoom and the background data management, what good would it be? If I were to give you a mouse and a keyboard and a monitor, but I couldn't do windowing, I couldn't do clickable icons, I couldn't do all of the other things that a graphical user interface affords, it would similarly not be a good product. And so we've really tried to treat interface design as everything from the thought popping into your head to whatever you want next. And that's where things like conversation, personality, memory and retrieval, very low latency, asynchronous parallel processing, etc., buzzwords aside, become so important, and that ends up being where we spend a huge amount of our time.
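[Editor's note: "asynchronous parallel processing" in a voice assistant usually means overlapping independent steps to hide latency, e.g. starting memory retrieval while speech-to-text is still finishing. This is a generic illustrative sketch under that assumption, not Sandbar's actual stack; the function names and timings are invented.]

```python
import asyncio
import time

async def transcribe(audio: str) -> str:
    await asyncio.sleep(0.2)          # stand-in for a speech-to-text call
    return f"transcript of {audio!r}"

async def retrieve_memories(user: str) -> list:
    await asyncio.sleep(0.2)          # stand-in for a memory-store lookup
    return [f"{user}: note about groceries"]

async def handle_utterance(audio: str, user: str):
    """Run transcription and retrieval concurrently, not back to back."""
    start = time.perf_counter()
    text, memories = await asyncio.gather(
        transcribe(audio), retrieve_memories(user)
    )
    return text, memories, time.perf_counter() - start

text, memories, elapsed = asyncio.run(handle_utterance("ring_clip.wav", "mina"))
print(f"{elapsed:.2f}s")              # roughly 0.2s, not 0.4s: the waits overlap
```

Run sequentially, the two 0.2-second steps would cost 0.4 seconds before the assistant could even start forming a reply; overlapping them is the kind of unglamorous latency work the paragraph above is pointing at.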
Do you think about, I just saw Oura, for example, they're probably going to IPO this year. Do you think about someone like that, who's approached it from the health angle, the ring form factor, potentially trying to do what you're doing? Is something like an Oura competition? I think the entirely honest answer is I don't really know. I'm curious to see, but at the end of the day, I really deeply believe that what you believe in dictates the choices you make. Ideally, you know what you believe in. Otherwise, you'll be making choices without understanding the deep why. I think for most wearable companies, their why is really rooted in health. So even if they're adding new features, I expect that to really be the focus. We are really focused on human-computer interaction. We are trying to help you think, remember, and act wherever you are. Unless someone has the same mission as us, I doubt they'll arrive at the same conclusion. I like that, because all these health wearables, they suck. I just stop using them. I'm not going to use the Oura anymore. I got a Whoop like a month ago and I'm already not wearing it. I mean, Ellis, I feel like you've talked about this too. These health wearables exist to just make you feel bad about yourself. You woke up tired, and guess what? Your sleep score is a 43. It makes you feel worse. I'm curious, Mina. It's clear, whether it's the telepathy or the ideation, that you are trying to afford people this ability to have bigger thoughts and think more clearly. Do you have a sense of what those thoughts and ideas will be? I feel like there's a pretty big hot-button issue right now about which of the ideas people have are even going to matter in the coming years. There's also a world where you just make it about recording your groceries as you remember them at the fridge, which I know is also a feature that you have.
Are you hoping people are going to have these bigger ideas that you're helping string together with the software, as you said? How weird do you want to get? Oh, go all the way. Follow your heart. Follow your ring finger. Tickle those index-finger buttons. One thing I really believe is that all thoughts are beautiful. Whether it's a grocery list, whether it's a heartfelt letter, whether it's some important interview you're prepping for, whether it's a Claude Code instance you're trying to check on on your Mac mini, the fact that you are having those thoughts, I think, is what makes it beautiful. What we are trying to do is make an interface that doesn't operate under the false assumption that there is a division in our minds between the practical and the personal and the professional. The nice thing in my mind about existing note and conversation tools is that they are the most flexible interfaces we have, and so you can use them in all of these ways. I think this becomes so important as we step into the future. I remember I started working with LLMs in like '21, '22, and then ChatGPT came out and there was a lot of excitement, but there was also a lot of concern about what the future would look like. Something that was very important to me was that I wanted to build something that would be valuable when we reached general intelligence. To me, that looked like meeting people where they are. Even after general intelligence, we'll be taking walks, we'll be thinking, we'll be remembering, we'll have things that we need to do. And importantly, I wanted to build something that prioritized and extended and augmented humans, because I think we ought to experience the freedom that this new technology affords. One could say, a bicycle for the mind. Oh, God. Don't. Also banned. No CRMs on the show, no bicycle for the mind on the show. What I'll finish that with is what I feel pretty palpably day-to-day.
I'm curious what it's like in LA, but definitely in New York, I feel people's fear of replacement and fear of how this technology is being portrayed. And I don't really see the human, or the extension, at the core of many of these narratives. And I do believe it's a choice, and that choice comes from the relationship that you design into the product and into the technology. And we can choose to make these things an extension of us. If we do anything, I hope that that is a message we can contribute to pushing forward, at least a bit. So you're hoping people don't deface your billboards? I hope people can take more walks and feel good about themselves. I hope people can look forward to the future. I hope people can wake up and be really happy. I'm certainly not worried about the art and expression world, because I think this is a tool that a lot of people are going to be able to make a lot more with, rather than less, relatively speaking. And I think there's a very big difference between a song made with Suno that is intended to just kind of play as background music and doesn't really have an opinion or your lived experiences and thoughts within it, and a song on Suno that has an artist statement about what the person was going for with it. You know, like my friend was able to make a song about his dog that he never would have been able to make before. Great, and it's hilarious, because he made it and it was a side of his expression that I'd never seen before. But I do think there is going to have to be some retraining, or different expectations for people of themselves, to kind of free them to think in this way, outside of the current lane that the world wants to put you in. I mean, I've definitely seen the beginnings of that. I've seen people who are creating at a volume that is stunning, as well as people who have been in creative ruts for a long time.
At the end of the day, I personally don't have a lot of judgments or opinions on how people, like, should conduct their lives. Like, I'm not expecting everyone to go off on zen meditations and be thinking deep thoughts 24/7, etc. I do think people would benefit from a life that generally affords them more flexibility and freedom to move and think as they prefer or as they choose, rather than as they'd have to. Alex, when you've automated everything, is what's left just, like, your relationships? Yeah. I mean, yeah, it's interesting. I've talked a little bit about this, but I don't really write anymore. Like, my stories, I talk to AI and it has all my context. And I talk to it like an editor talking to a reporter crafting a story. So I say, no, that's not good. That's a lazy transition. It's smarter to lead with this angle because this is fresh. Move this graph here. Your kicker sucks. Like I'm training a young reporter, except it's hopefully my AI that gets smarter. Although I will say, Claude needs to work on that memory thing. It's not very good. So it's really changed the zero-to-one creation process for me and made it more fun to actually put writing out, because I always hated that. I always hated just, like, looking at a blank page and a bunch of transcripts of interviews and needing to assemble all that and then begin, you know, the process. And this gets me there like immediately. And it does feel empowering. So we're moving to a future where you have schmooze leverage and that's all you have. Schmooze. Yeah. And like taste leverage. Ideally, and this is not happening right now, but ideally what this does is it frees me up, because my writing process takes less time, to consume more, to talk to more people, to build more of a bird's-eye view of what I'm covering. 
And then what you're really getting is like my curation of this universe that I'm covering and access to the people I'm talking to, through the pod, through the newsletter. And that to me is my edge, not my prose. I think I've been talking about this a lot, but I think writing will generally go towards very human, anti-AI, beautiful prose that honestly gets weirder and weirder and more creative, pushing the boundaries. And then writing that is just a medium or a means to something else, which is, like in my case, reporting; in some other case, maybe it's like a news summary or whatever. And I think everything else in the middle is going to have a really hard time. That's like my working theory right now. I would buy that. I mean, you're each pretty great people to hang out with, and that interpersonal dynamic, I think, really matters. I think that's part of why I especially like when this technology is framed as self rather than other, because when AI is other and is competing with other humans, it almost confuses what the actual benefit is of someone who is different than you. A lot of why I enjoy talking with you or I enjoy talking with other people is other people challenge me and they have lived experiences that are very different than my own. And those things are pretty hard to get from a machine, but you can kind of be fooled every now and then, which I don't necessarily think is helpful to people. The other thing you're saying that makes me think of is something that Kirak told me when we started working together on this, which was how he wanted to program while walking through a forest. And he wondered if what we were building could enable that. And similarly, I wonder what it would look like if you could just take a walk and write a story, and whether that would change what the story would look like, because you'd be thinking with your mind and you'd be moving your body. 
And I'll flag that I don't expect, again, all of our thinking to occur while in some idyllic motion through the wilderness, but I do want people to have more freedom to pick that and to use the interfaces they want, wherever they are, in whatever ergonomic or bodily positions they would prefer to be in, rather than having to be chained to whatever window we find ourselves in. So tell us, how has AI changed how you build and organize the company? I have one funny thing I was thinking of, which is how design crits go when someone's idea is no good: everybody says it's no good, and then they say, but all thoughts are beautiful. How does the company run with AI differently? I can say perhaps how I run and then how I see our team running. The two biggest impacts to me are that I arrive at decisions much more quickly, and I'm creating a lot more. So I usually now use either Stream or Claude, but sometimes ChatGPT, to think through things that might otherwise have required me to write and ponder and internalize for a very long time. These things are just good at asking me questions that route me to my true intention pretty quickly. The other thing is that, like many founder/CEO types, most of my life is some mix of hiring, fundraising, etc. But now that I can build with Claude Code and I can create images with image models, there was a good period where, like many other people, I was just sleeping very little and just making a lot, and then I eventually had to be like, no, you need to be resting more. But for example, there are some images that we've been posting recently on Instagram that I absolutely love. They're like black-and-white images of the ring balancing on strange body parts, and those I was able to make using AI image tools. And I'm seeing something pretty similar with the team. I think a lot of people are talking about the impact of AI on coding, but I'll say it's impacting everything for us. 
We do everything in-house, including marketing, supply chain, logistics ops, etc. And we had a supply chain review a couple of weeks ago where our head of ops basically made this massive reconfigurable dashboard that showed the lead time of every item and let you switch between, you know, what different tariffs would result in at these different price points. And we had to make some really complicated decisions that this tool made really easy. And our head of hardware, watching this, was stunned. He was like, this would have taken a team of people weeks to build at his prior companies, at Fitbit and at Peloton. And now just our head of ops was able to build that himself. And he emphasized that it was very hard to build, required a ton of going back and forth with Claude to create that dashboard. But we all had this collective moment of, like, wow, you can do so much more across everything. I have a good question to close on. Is Bluetooth finally not bad? Because I feel like there were so many gadgets over the years that were trying to do ambient awareness or speaking to it or taking notes, like whatever it is. But Bluetooth was always the constraint because it was so terrible. Is it better now? It's definitely matured a lot, but it remains pretty hard. For example, getting to something that is always connected is still pretty non-trivial. We have two full-time firmware engineers and antenna consultants and connectivity consultants. Oh yeah. You're holding it wrong. Well, it's actually kind of cool seeing the tests that you have to do, because you need to check how the connectivity changes based on how many people are in a room, how much interference is in a room. So the underlying protocol is pretty good, but for better or worse, a lot of hardware in general has not necessarily become that much easier. I would say in our case, we actually have some unique challenges compared to other health wearables that are just sending your heart rate. 
We're sending audio. We're sending a lot more data through a metal housing. So the connectivity becomes even more difficult. And that is why we are taking our time to ship. Is that what you're most jealous of, their U or W chip or whatever it is? Which one? From Apple. That's a good question. So that seems like one of their key advantages with all this wireless stuff that they've done is just building that whole communication pipeline in-house. Yeah. I'd say that's probably part of it. Apple does get to do certain things with their devices that aren't immediately available to other peripherals in iOS, but we're pretty happy with the access that we have, frankly. It's enough for our application. Well, no one thought that was going to be the last question, but I mean, I've been in technology a long time and I think Bluetooth and Wi-Fi are my two least favorite technologies. So we had to end there. I had to know what the latest status was on those so I could properly monitor the situation. Very important. Yeah. So thanks for the info. Yeah, Mina. We really appreciate your time, man. Looking forward to getting a ring. Excited to get you one. Yeah. If it isn't completely clear, Alex really, really wants one as soon as possible. He's trying to be polite, but I'm really not. He got in batch one. We've got to get you a beta unit, but I think you got in batch one, right? I'm not sure. We'll figure that out after this part. Batch two is still shipping within the summer, so any listeners who want to get in, get in before the next batch. Got it. All right, Mina. We'll let you go. Appreciate it. Yeah, thank you. That's it for this week's show. Don't forget to like and subscribe wherever you get podcasts. We are access.show on the internet and you can find us in video at access pod on YouTube. And if you like this episode, leave us a five-star review, whatever the max amount of stars is on the platform that you're using, or a thumbs up. 
And please share this with a friend, share it on your social networks. It really means a lot. We're a new show and we love the word of mouth. As long as five stars have been around, people only ever give one star or five stars. Well, give Access five stars. I just love that user interface debate. It's like, let's just go with a thumbs up or thumbs down or something. Okay, Alex, we're doing the outro. Come on. You can find Alex's really lovely newsletter at sources.news. And you can find me at hamburger on Twitter and at meaning.company. Access is part of the Vox Media podcast network and the show is produced by Hokes creators. Bye. Bye. Bye.