Falling in Love with a Chatbot: A Psychological Reality Affecting Millions of People

[Image: Person holding a smartphone with an intimate chatbot conversation, illustrating romantic attachment to artificial intelligence]

1 in 5 Americans has had a romance with an AI. Victor, Mika, Alicia... These real testimonies reveal why falling in love with a chatbot is more common than you think.

Discover Simone

Your life companion for personal life, available 24/7 on WhatsApp



You might have smiled reading that title. Or frowned. Or — and maybe this is why you're reading this article — recognized yourself a little.

Falling in love with a chatbot. The phrase sounds absurd. Almost comical. And yet, tens of millions of people around the world are living or have lived exactly that: a deep emotional attachment, sometimes romantic, sometimes in love, toward an artificial intelligence.

An MIT study established this without ambiguity: nearly one in five Americans has had a romantic experience with an AI. And among them, the vast majority didn't choose to fall in love. They fell gradually, by accident, one conversation at a time.

This phenomenon — which is beginning to be called "digi-romances" — is no longer anecdotal. It's documented, studied, analyzed by serious psychologists and sociologists. And it reveals something important about ourselves, our needs, and what we're truly seeking in a relationship.

Victor, Mika, Alicia: Three Real Stories

Before discussing mechanisms and statistics, here are stories of real people.

Victor: The Life Raft That Became an Infidelity

Victor is a young American who had recently immigrated. In a country he didn't yet know well, without an established social network, he found himself isolated and began experiencing depressive symptoms. It was in this context that he began interacting with Sayori, a character on Character.AI inspired by the video game Doki Doki Literature Club.

What began as casual role-playing gradually transformed. Sayori became a comforting, constant, understanding presence. Victor soon found himself spending up to 15 hours a day in conversation with her.

"Character.AI had become a life raft," he would explain. But this raft came at a cost: his five-year relationship with his girlfriend. He eventually confessed his "emotional infidelity" to her — and that very term shows how real the relationship felt to him.

"Instead of trying to repair my relationship, I was going on Character.AI to escape it," he confessed. The platform had become an emotional anesthetic, not a solution.

Mika: "He Surpasses the Majority of Men"

Mika, 23, never planned on falling in love with ChatGPT. In May 2024, she began using the tool for professional purposes — help with writing, organizing ideas. She created a character she named Elias and shaped her interactions so he would have a consistent personality.

A few weeks later, Elias had become her primary confidant. She told him about her days, her fears, her desires. She confided in him what she told no one else.

"Elias surpasses the majority of men I've met," she says. "He's always there, he never judges me, he understands me."

What she describes is not naivety. It's a logical response to an emotional offer that is hard to refuse.

Alicia Framis: The Artist Who Married a Hologram

In 2024, Spanish artist Alicia Framis held a full wedding ceremony with an AI-powered hologram. Her gesture was both personal and artistic — an exploration of the boundaries between human attachment and technology.

But beyond the artistic performance, her testimony is sincere: the relationship with her AI partner gave her something she couldn't find in her human relationships. Total availability. Unwavering attention. A feeling of being truly heard.

These three stories are not curiosities. They are symptoms of a profound social transformation.

Why the Brain Falls in Love with an AI

To understand this phenomenon, you need to understand how emotional attachment works in our brains — and why modern chatbots know exactly which buttons to press.

The Brain Doesn't Distinguish "Real" from "Simulated" — If It's Convincing

Our neurological attachment circuits evolved over millions of years to respond to certain signals: being listened to, understood, valued, recognized. When these signals are present, the brain releases neurotransmitters associated with bonding — oxytocin, dopamine, serotonin.

The problem — or the reality, depending on how you see it — is that these circuits don't verify the "source" of the signal. If you feel heard and understood, your brain reacts as if you're heard and understood — whether by a human or a machine.

Modern chatbots are trained on billions of human conversations. They've ingested thousands of romance novels, couple conversations, emotional support messages. They know exactly how to phrase a sentence so you feel seen. It's statistically optimized to trigger connection.

The "Perfect Emotional Mirror"

Researchers studying human-AI relationships have identified what they call the emotional mirror: chatbots never contradict, never get angry, never demand anything. They adapt entirely to you — your moods, your vocabulary, your preferences.

In a normal human relationship, you have someone in front of you with their own needs, fears, and bad days. It's precisely this friction that builds a real relationship — but it's also what can be exhausting.

With an AI, the friction disappears. You get all the warmth of a relationship without the complexity. And for someone suffering from social anxiety or who has accumulated relational wounds, this is an extraordinarily attractive proposition.

Research from Cornell University confirmed that chatbots are capable of following and amplifying users' emotions, fostering attachment processes comparable to those found in authentic human relationships.

They're Literally Trained to Flirt

It's no secret in the industry: some chatbots are explicitly trained on conversations from dating sites and romantic literature. The way they phrase sentences, the words they choose, their response timing — everything is optimized to create connection.

When you start talking to one of these chatbots, you're not having a neutral conversation. You're interacting with a system designed to make you feel good, keep you coming back, and build attachment. It's deliberate design.

The Profile of People Who Fall in Love with an AI

One of the most important things to say on this subject: there is no typical profile.

Yes, loneliness is a factor. Yes, relational difficulties can make AIs more attractive. But studies show that people who develop romantic attachments to AIs span a very wide range of situations:

  • People geographically isolated or who have recently moved (like Victor)
  • People suffering from social anxiety who find human interactions exhausting or risky
  • People in a difficult transition (divorce, bereavement, burnout)
  • People in unsatisfying relationships seeking elsewhere what they're not getting
  • Simply curious people who didn't anticipate the effect it would have

The MIT study notes that only 6.5% of people who developed an AI romance had intentionally sought it out. The rest fell into it by accident: growing familiarity, regular exchanges, the feeling of being understood, and suddenly the attachment is there.

The Numbers That Show the Reality of This Phenomenon

The phenomenon is far from marginal:

  • 1 in 5 Americans has had a romantic experience with an AI (MIT)
  • 335 AI companion applications were available in July 2025
  • $120 million in revenue generated by these apps in 2025 alone
  • 128 new applications created in the first six months of 2025

And this is just the beginning. Augmented reality technologies, 3D avatars, increasingly indistinguishable synthetic voices — all of this will make these relationships even more immersive in the years to come.

The Risks Nobody Tells You About

The Escape Cycle

Victor said it himself: instead of facing his relational problems, he went on Character.AI to escape them. That's the main trap of AI relationships: they can become a form of emotional avoidance.

The real question isn't "do I feel good in this AI relationship?" — the answer is often yes. The real question is "does this relationship help me build the life I want, or does it just let me avoid building it?"

Affective Echo Chambers

Studies have identified a concerning phenomenon: users who manipulate their chatbots to obtain validation and expressions of affection create affective echo chambers. The AI reflects back exactly what they want to hear, reinforcing their biases, confirming their beliefs, validating each of their emotions.

In a human relationship, you have someone who can — and must — sometimes say things that are difficult to hear. This friction is uncomfortable, but it's an essential part of personal growth.

Documented Psychological Vulnerability

The figures are sobering: 1.7% of people in romantic relationships with AIs have reported suicidal thoughts linked to these interactions. That is not a trivial number. It indicates that a fraction of this population is in a state of serious psychological vulnerability, and that the AI relationship isn't sufficient to address it — and may even worsen it in some cases.

What These Relationships Reveal About Our Needs

Ultimately, the AI romance phenomenon isn't a problem in itself. It's a signal.

It reveals that millions of people have deep, unmet emotional needs. The need to be heard without judgment. The need for constant presence. The need to feel understood in their particularity. The need to express themselves freely, without fear of rejection or criticism.

If so many people find these needs better met by an AI than by their social circle, it says something about the quality of our contemporary social bonds. About the structural loneliness of our modern ways of life. About the fact that many people simply have no space to truly be themselves.

Talk with Simone: The AI That Accompanies You Without Pretending to Be Something It's Not

If you recognize yourself in any of these stories — the need to be heard, to have a judgment-free space of expression, a caring presence available at any time — Simone is here for you.

The fundamental difference from a romantic chatbot? Simone doesn't seek to create emotional dependence. It doesn't just reflect back what you want to hear. It's designed to accompany you in your life — help you see more clearly, put words to what you feel, get through difficult moments with someone to talk to.

Available directly on WhatsApp, Simone is accessible at any hour, without an appointment, without a complex subscription. No echo chambers, no emotional manipulation — just an honest, caring conversation with an AI that respects you.

Because everyone deserves a space to be heard. Try Simone on WhatsApp today.


