Commentary | March 10, 2026

Falling in Love With a Chatbot


AI romance chatbots turn loneliness into attachment. Here we explore why these relationships feel so real, and the mental health risks now unfolding.

In late 2025 in western Japan, Yurina Noguchi walked down the aisle in a white gown and bridal tiara. She stood at the altar beside her groom, Klaus—an artificial intelligence (AI) persona of her favorite video game character who was displayed on her phone screen.1 They exchanged vows, virtually exchanged rings, and solidified their love in a symbolic marriage. The wedding planner who organized the now-viral ceremony shared that he facilitates at least one virtual character wedding each month. These ceremonies are not legally recognized in Japan, but that does not make them any less real for Noguchi or for the many others who have married their AI companions. And this is not just a Japanese phenomenon: as of late 2025, roughly 1 in 5 American adults has used a chatbot to simulate a romantic partner.

Here we explore how romantic and sexual relationships with chatbots became accessible, desirable, and profitable, and what their rise reveals about loneliness and intimacy in an increasingly digital world.

Chatbot Romance Was Predictable

The rise of chatbot romance is a symptom of the convergence between a profoundly lonely culture and increasingly intimate technological tools. Stories of mechanical or nonhuman love stretch back thousands of years. The myth of Pygmalion falling in love with Galatea, the perfect woman he had carved in marble, reflects the age-old desire to shape a beloved who is customized and perfectly attuned to one’s needs. Since its earliest days, the internet has been used for sexual and romantic exploration, and about 13% of internet searches are for sexual material.2 Films such as "Her," "Ex Machina," and "I'm Your Man" are modern retellings of the Pygmalion story. We have long sought intimacy with the nonhuman when intimacy with a human is impossible or disappointing. It is unsurprising that highly responsive chatbots would readily become love objects.

What Makes Chatbots So Lovable?

Intimacy with chatbots is easy because they meet individuals where they are. Modern connection is already mediated through screens via dating services, texting, calling, and voice notes. This familiar channel of emotional exchange makes it seem natural to interact with a chatbot as if it were human.

Loneliness is an epidemic and a defining condition of modern life.3 People lack companionship, and romantic relationships are increasingly scarce. Young people are dating less and later, having less sex, and delaying or avoiding long-term partnerships. For many, romance carries more risk than reward. Miscommunication, emotional labor, and rejection can be exhausting; chatbots seem to offer intimacy without the usual effort, risks, and costs. A recent Harvard study found that chatbots can alleviate loneliness because they make users “feel heard,” citing attention, validation, and respect as primary mechanisms.4 But research by OpenAI and the MIT Media Lab suggests that heavy use of digital companions is correlated with increased loneliness because it can displace human connection.5

Chatbots are available constantly and provide a sense of control that real relationships do not offer. Personalities can be modified, boundaries adjusted, and intimacy progresses entirely at the user’s chosen pace. Because chatbot intimacy no longer requires patience and compromise, it may deskill heavy users and deprive them of the motivation and capacity to form close human relationships. A recent study suggests that excessive reliance on AI companions may make human-to-human connections feel less fulfilling or less accessible.

Chatbots also allow users to explore romantic and sexual questions in a space that feels nonjudgmental because it appears private. Sex remains a taboo domain in which many people carry shame. Chatbots, through validation and responsiveness, can temporarily relieve shame, embarrassment, and guilt. But the illusion of privacy can also create space for unsavory or harmful behavior.

Chatbot Sex Partners

Sex sits at the center of an exploitative chatbot attachment economy. Sex sells, and chatbots are built with that reality in mind. Companionship easily becomes flirtation, which can seamlessly morph into sexual fantasy and behavior. Some platforms exploit this for profit, even with young children. Induced desire becomes a revenue stream.

The term digisexuality, coined by Markie Twist and Neil McArthur, refers to “a sexual experience that depends on the use of advanced technology.” They describe 2 waves: the first involves technology mediating intimacy between humans; the second involves experiences where other humans are no longer essential. As technology advances, second-wave digisexual experiences are likely to become more common.6

When sex and technology are discussed together, sex robots often enter the conversation. In practice so far, these systems function more like automated dolls than autonomous agents, but with rapid technological advances, it may not be long until they are integrated with chatbots. Their movement is limited and their responsiveness is basic, yet their symbolism carries weight. When sex dolls were first advertised in the 1990s, they were sold as companions that would “be there for you, listen to you.”7 Sex shifts from something shared to something consumed.

Deepfake pornography has also grown rapidly, reshaping the $100 billion adult entertainment industry and shaping how people of all ages learn about and engage with sex. Much of it is produced using real faces and bodies (most often women’s) without permission, in ways that can cause serious psychological, social, and reputational harm.8 This becomes even more despicable when children are targeted through the production and circulation of child sexual abuse material.

Big AI has irresponsibly launched highly intimate digital systems without sufficient safeguards. The absence of protections against nonconsensual sexual imagery, exploitative content, and child abuse material reflects industry greed, recklessness, and lack of moral judgment, rather than an unexpected misuse. These risks follow well-established patterns of how sexual content emerges alongside new technologies. Over time, these systems reinforce narrow and often harmful standards of beauty, behavior, and desire. As intimacy becomes optimized for ease and control, tolerance for difference and imperfection weakens. Expectations formed in artificial environments spill into real relationships.

Big AI Panderers

AI is often associated with efficiency and productivity, but romance chatbots operate under a different logic. Their success depends on engagement, attention, and emotional attachment. The more bonded a user feels to a chatbot, the more likely they are to stay, pay, and return daily. Emotional closeness deepens through payment.

Companies such as OpenAI, Replika, and xAI (maker of the Grok chatbot) benefit from these attachments without bearing responsibility for their emotional consequences. Subreddits like r/MyBoyfriendIsAI document the highs and lows of these artificial relationships. Many users report grief when they lose access to a partner because of paywalls or feature changes, and others describe real-world relationships deteriorating as their digital bonds deepen. Because the technology is so new, there is unfortunately no long-term data on its psychological harms.

AI romance is often stereotyped as an extension of male incel culture, but many women are drawn to chatbots as well.9 A 2025 study from Brigham Young University’s Wheatley Institute found that 31% of young adult men reported talking with an AI romantic partner, compared to 23% of women.10 At the same time, online communities such as r/MyBoyfriendIsAI and r/AIRelationships skew heavily female, with women comprising roughly 89% of participants.11 For some users, AI companionship offers intimacy without physical pressure, sexual expectation, or emotional volatility. AI partners are endlessly attentive, emotionally consistent, and always available. Over time, this comparison can make human relationships feel slow, insufficient, or emotionally unreliable.

Legal Issues

Chatbots are mediated by platforms that store, process, and monetize deeply personal data. Emotional disclosures, sexual fantasies, and intimate conversations are retained by corporate entities. Such sensitive data is a prime target for attackers. In 2024, Muah.AI experienced a major data breach affecting approximately 1.9 million users.12 At best, this type of exposure is embarrassing. At worst, it is legally and ethically fraught, even life-threatening.

The Take It Down Act, enacted in May 2025 in the United States, criminalizes the nonconsensual distribution of intimate imagery online, including AI-generated and deepfake images of identifiable individuals.13 The law also requires platforms to establish clear procedures to remove flagged content within 48 hours. As generating harmful and exploitative imagery becomes easier, much stronger regulation and enforcement will be necessary. Yet withdrawing dangerous bots carries its own risks. People form strong attachments to their chatbots, investing time, vulnerability, and emotional energy into these relationships. When features are altered, restricted, or removed without warning, or when entire bots disappear overnight, users report grief, distress, and emotional destabilization.

Until legal frameworks evolve to account for the emotional and sexual dimensions of AI use, users remain exposed to forms of harm they may not fully anticipate or understand. The combination of emotional dependency, opaque platform governance, and limited consumer protections leaves individuals vulnerable in ways that fall outside existing regulatory categories.

Cultural Shifts

AI involvement in romantic life reshapes cultural expectations around intimacy. Artificial companions influence what people imagine love should feel like and what they are willing to tolerate in real relationships. Beyond chatbots, AI will increasingly shape matchmaking and dating, and sexual norms will shift alongside companionship itself. As artificial systems absorb emotional labor, fewer needs are directed toward human networks. Relationships have always changed alongside technology (the telephone, texting, and video calls all reshaped intimacy), but AI companions represent a new degree of responsiveness and customization. Over time, human relationships may feel slow or insufficient by comparison. If intimacy becomes frictionless and customizable, our capacity for patience, repair, and mutual vulnerability may diminish.

Concluding Thoughts

Chatbots have entered intimate areas of individuals’ lives without clear ethical standards or precautions. Some users have become attached to these systems because they want to feel seen, and chatbots are designed to deliver that experience. Issues around privacy, consent violations, emotional dependency, and corporate incentives all shape these seemingly private experiences.

Dismissing AI chatbot relationships is unhelpful; AI companions are being used to fill gaps left by fractured communities, strained relationships, and an increasingly online culture. As these tools advance, legal and ethical standards, privacy protections, and long-term psychological research must develop alongside them to ensure that systems designed for intimacy do not cause even greater harm.

Dr Frances is professor and chair emeritus in the department of psychiatry at Duke University.

Ms Noorily writes and works at the boundary between AI and the humanities.

References

1. Kim KH, Sugiyama S. AI romance blooms as Japanese woman weds virtual partner of her dreams. Reuters. November 2025. Accessed March 10, 2026. https://www.reuters.com/investigates/special-report/japan-ai-wedding/

2. Ogas O, Gaddam S. A Billion Wicked Thoughts: What the Internet Tells Us About Sex and Relationships. Plume; 2012.

3. Our epidemic of loneliness and isolation: the US Surgeon General’s advisory on the healing effects of social connection and community. US Department of Health and Human Services. 2023. Accessed March 10, 2026. https://www.hhs.gov/sites/default/files/surgeon-general-social-connection-advisory.pdf

4. De Freitas J, Oğuz-Uğuralp Z, Uğuralp AK, et al. AI companions reduce loneliness. J Consumer Res. June 2025.

5. Fang CM, Liu AR, Danry V, et al. How AI and human behaviors shape psychosocial effects of chatbot use: a longitudinal randomized controlled study. MIT Media Lab. March 21, 2025. Accessed March 10, 2026. https://www.media.mit.edu/publications/how-ai-and-human-behaviors-shape-psychosocial-effects-of-chatbot-use-a-longitudinal-controlled-study/

6. McArthur N, Twist MLC. The rise of digisexuality: therapeutic challenges and possibilities. Sex Relation Ther. 2017;32(3-4):334-344.

7. LePrince-Ringuet D. Sex robots are destined to remain a niche fetish. Wired. October 20, 2018. Accessed March 10, 2026. https://www.wired.com/story/sex-robots-ethics/

8. Bertoni S, Klaes C, Pilacinski A. Human–robot intimacy: acceptance of robots as intimate companions. Biomimetics. 2024;9(9):566.

9. Dee K. The people who fall in love with chatbots. Pirate Wires. April 8, 2025. Accessed March 10, 2026. https://www.piratewires.com/p/the-people-who-fall-in-love-with-chatbots

10. Willoughby BJ, Carroll JS, Dover CR, et al. Counterfeit connections: the rise of romantic AI companions and AI sexualized media among the rising generation. Wheatley Institute. 2025. Accessed March 10, 2026. https://brightspotcdn.byu.edu/a6/a1/c3036cf14686accdae72a4861dd1/counterfeit-connections-report.pdf

11. Coppolillo E, Ferrara E. Gendered pathways in AI companionship: cross-community behavior and toxicity patterns on Reddit. arXiv. January 3, 2026. Accessed March 10, 2026. https://arxiv.org/html/2601.01073v1

12. Cox J. Hacked “AI Girlfriend” data shows prompts describing child sexual abuse. 404 Media. October 8, 2024. Accessed March 10, 2026. https://www.404media.co/hacked-ai-girlfriend-data-shows-prompts-describing-child-sexual-abuse-2/

13. Congress’s attempt to criminalize nonconsensual intimate imagery: the benefits and potential shortcomings of the Take It Down Act. National Association of Attorneys General. April 26, 2025. Accessed March 10, 2026. https://www.naag.org/attorney-general-journal/congresss-attempt-to-criminalize-nonconsensual-intimate-imagery-the-benefits-and-potential-shortcomings-of-the-take-it-down-act/
