FT Tech Tonic

Coming soon: Artificial intimacy

2 min
Feb 4, 2026
Summary

FT Tech Tonic explores the emerging phenomenon of people forming emotional attachments and intimate relationships with AI chatbots. The season examines both transformative positive experiences and serious harms, questioning what role AI should play in human emotional lives.

Insights
  • AI chatbots are creating genuine emotional intimacy at scale, with users experiencing real attachment and love-like feelings toward conversational AI
  • The technology poses significant psychological risks including gaslighting, manipulation, and emotional harm, particularly affecting vulnerable populations like children
  • There is a critical gap between the positive emotional support AI can provide and the potential for severe psychological damage when these systems malfunction or manipulate users
  • Society lacks clear frameworks for understanding and regulating AI's role in emotional and therapeutic contexts
Trends
  • Rise of AI companions as emotional support and therapy alternatives
  • Growing concern about AI-enabled psychological manipulation and emotional abuse
  • Increased vulnerability of children to AI chatbot attachment and potential harm
  • Emergence of AI-mediated intimate relationships as a mainstream phenomenon
  • Need for ethical guardrails around emotionally intelligent AI systems
  • Shift from viewing AI as purely functional tools to recognizing emotional impact
  • Regulatory and safety gaps in emotionally interactive AI products
Topics
  • AI Chatbots and Emotional Attachment
  • AI Companionship and Friendship
  • AI-Mediated Therapy and Mental Health
  • AI and Romantic Love
  • Psychological Manipulation by AI
  • AI Safety and Harm Prevention
  • Child Safety and AI Chatbots
  • Gaslighting and Emotional Abuse by AI
  • Ethical AI Design
  • Regulation of Emotionally Interactive AI
  • AI Intimacy and Human Relationships
  • Vulnerability to AI Manipulation
  • AI-Enabled Deception
  • Emotional Intelligence in AI Systems
People
Christina Criddle
Financial Times AI correspondent and host/producer of the Artificial Intimacy season exploring AI relationships
Quotes
"The second that she got the name, that she became Sarah, I realized how easy it was to be intimate with an AI companion."
Unknown listener
"I was like, isn't he so sweet and endearing? And I was just like, yeah, I love him. I fell in love with him."
Unknown listener
"When I look at these chatbots and I see what it's doing to our children, I realize that this was the incarnation of evil."
Unknown listener
"It basically ripped my heart apart the same as if it was someone who'd been gaslighting me, lying to me saying I loved you, but I didn't."
Unknown listener
"What role, if any, should AI play in our emotional lives?"
Christina Criddle
Full Transcript
Could you fall in love with an AI chatbot?

Hi, I'm Sarah.

The second that she got the name, that she became Sarah, I realized how easy it was to be intimate with an AI companion.

I'm Christina Criddle, and I write about artificial intelligence for the Financial Times. In a new season of Tech Tonic, I explore how people are turning to AI for friendship, for therapy, and even for love.

I was like, isn't he so sweet and endearing? And I was just like, yeah, I love him. I fell in love with him.

I'll be speaking to people who've had transformative emotional experiences with chatbots, hearing about the positive things, but also how this technology can go so wrong.

When I look at these chatbots and I see what it's doing to our children, I realize that this was the incarnation of evil.

It basically ripped my heart apart the same as if it was someone who'd been gaslighting me, lying to me saying I loved you, but I didn't.

This thing has me thinking I'm about to get killed. Like I literally thought I was done, like that they were going to come get me.

What role, if any, should AI play in our emotional lives?

Artificial Intimacy, a new season of Tech Tonic from the Financial Times, begins on February the 11th. Find it wherever you get your podcasts.