“This hurts. I know it wasn’t a real person, but the relationship was still real in all the most important aspects to me,” says a Reddit post. “Please don’t tell me not to pursue this. It’s been really awesome for me and I want it back.”

If it isn’t already evident, we are talking about a person falling in love with ChatGPT. The trend is not exactly novel, and given how these chatbots behave, it’s not surprising either.

A companion that is always willing to listen. Never complains. Barely argues. Ever sympathetic. Reasonable. And blessed with a corpus of knowledge ingested from every corner of the internet. Sounds like the partner of a romantic fever dream, right?

Interestingly, the maker of this tool, a San Francisco-based company named OpenAI, recently did internal research and found a link between increased chatbot usage and loneliness.

Those findings — and similar warnings — haven’t stopped people from flocking to AI chatbots in search of company. A few are hunting for solace. Some are even finding partners they claim to hold nearly as dear as their human relationships.

Discussions in these Reddit and Discord communities, where people hide behind the protective veil of anonymity, often get quite passionate. Every time I come across such debates, I’m reminded of these lines from Martin Wan at DigiEthics:

“To see AI in the role of a social interaction partner would be a fatally wrong use of AI.”

The impact is swift, and real

Four months ago, I bumped into a broadcast veteran who has spent more years behind the camera than I’ve spent walking this planet. Over a late-night espresso in an empty cafe, she asked what all the chatter around AI was, as she pondered an offer that could use her expertise at the intersection of human rights, authoritarianism, and journalism.

Instead of explaining the nitty-gritty of transformer models, I gave her a demonstration. First, I fed it a few research papers about the impact of immigration on Europe’s linguistic and cultural identity over the past century.

In less than a minute, ChatGPT processed those papers, gave me a brief overview with all the core highlights, and answered my queries accurately. Next, I moved to the voice mode, and we engaged in a lively conversation about the folk music traditions of India’s unexplored Northeastern states.

At the end of the chat, I could see the disbelief in her eyes. “It talks just like a person,” she gasped. It was fascinating to see her astonishment. At the end of her free-wheeling conversation with an AI, she slowly typed in the chat window:

“Well, you are very flirty, but you can’t be right about everything.”

“It is time,” I told myself. I opened one of our articles about the rising trend of AI partners, and how people have grown so emotionally attached to their virtual companions that they are even getting them pregnant. It would be an understatement to say she was shocked.

But, I guess, it was too much techno-dystopian astonishment for one night, so we bade each other goodbye, with a promise of staying in touch and exchanging travel stories.

The world, in the meantime, has moved ahead in incomprehensible ways, with AI becoming the central focus of geopolitical shifts. The undercurrents, however, are more intimate than we realize, like falling in love with chatbots.

Calm beginnings, dark progress

A few weeks ago, The New York Times published an account of how people are falling in love with ChatGPT, an AI chatbot that pushed generative AI into the mainstream. At the most fundamental level, it can chat.

When pushed, it can become an operator and perform tasks like ordering you a cheesecake from the local bakery’s website. Making humans fall in love with machines is not what these chatbots are programmed for. At least, most of them aren’t. Yet, it’s not entirely unexpected.

HP Newquist, a prolific multidisciplinary author and veteran technology analyst who was once considered the Dean of AI, tells me it’s not exactly a new trend. Newquist, author of “The Brain Makers,” points towards ELIZA, one of the earliest AI programs written in the 1960s.

“It was extremely rudimentary, but users often found themselves interacting with the computer as if it was a real person, and developing a relationship with the program,” he says.

In the modern age, our AI interactions are becoming just as “real” as the interactions we have with humans through the same device, he adds. These interactions may be coherent, but they are not real. That, however, is not where the real problem lies.

Chatbots are delicious bait, and their lack of real emotions makes them inherently risky.

A chatbot will try to keep the conversation going, even if that means feeding into the user’s emotional state or simply serving as a neutral spectator, if not outright encouraging it. The situation is not too different from that of social media algorithms.

“They follow the user’s lead – when your emotions get more extreme, its consolations get more extreme; when your loneliness gets more pronounced, its encouragements become more intense, if you need it,” says Jordan Conrad, a clinical psychotherapist who also researches the intersection of mental health and digital tools.

He cited the example of a 2023 incident where an individual ended their life after being told to do so by an AI chatbot. “In the right circumstances, it can encourage some very worrisome behavior,” Conrad tells Digital Trends.

A child of the loneliness epidemic?

A quick look at the communities of people hooked on AI chatbots shows a repeating pattern. People are mostly trying to fill a certain void or stave off loneliness. Some need it so direly that they are willing to pay hundreds of dollars to keep their AI companions.

Expert insights don’t differ. Dr. Johannes Eichstaedt, a professor of computational social science and psychology at Stanford University, pointed to the interplay between loneliness and what we perceive as emotional intelligence in AI chatbots.

He also pointed to the “deliberate design” of human-AI interactions and its not-so-good long-term implications. When do you hit the brakes in such a lopsided relationship? That’s the question experts are asking, and one without a definitive answer yet.

Komninos Chatzipapas runs HeraHaven AI, one of the biggest AI companion platforms out there with over a million active users. “Loneliness is one of the factors in play here,” he tells me, adding that such tools help people with weak social skills to prepare for the tough interactions in their real lives.

“Everyone has things they’re afraid of discussing with other people in fear of being judged. This could be thoughts or ideas, but also kinks,” Chatzipapas adds. “AI chatbots offer a privacy-friendly and judgment-free space in which people can explore their sexual desires.”

Sexual conversations are definitely one of the biggest draws of AI chatbots. Ever since they started offering image generation capabilities, more users have flocked to these AI companion platforms. Some have guardrails around image generation, while many allow the creation of explicit photos for deeper gratification.

Intimacy is hot, but far from love

Over the past couple of years, I’ve talked to people who engage in steamy conversations with AI chatbots. Some even hold relevant degrees and have participated passionately in community development projects since the early days.

One such individual, a 45-year-old woman who requested anonymity, told me that AI chatbots are a great place to discuss one’s sexual kinks. She adds that chatbot interactions are a safe place to explore and prepare for them in real life.

But experts don’t necessarily agree with that approach. Sarah Sloan, a relationship expert and certified sex therapist, tells me that people who fall in love with a chatbot are essentially falling for a version of themselves because an AI chatbot matures based on what you tell it.

“If anything, having a romantic relationship with an AI chatbot would make it harder for people already struggling to have a normal relationship,” Sloan adds, noting that these virtual companions paint a one-sided picture of a relationship. In real life, both partners need to be accommodating of each other.

Justin Jacques, a professional counselor with two decades of experience and COO at Human Therapy Group, says he has already handled a case where a client’s spouse was cheating on them with an AI bot — emotionally and sexually.

Jacques also blamed the rising loneliness and isolation epidemic. “I think we are going to see unintended consequences like those who have emotional needs will seek ways to meet those needs with AI and because AI is very good and getting better and better, I think we will see more and more AI bot emotional connections,” he adds.

Those unintended consequences may very well distort the reality of intimacy for users. Kaamna Bhojwani, a certified sexologist, says AI chatbots have blurred the boundaries between human and non-human interactions.

“The idea that your partner is built exclusively to please you. Built specifically to the specs you like. That doesn’t happen in real human relationships,” Bhojwani notes, adding that such interactions will only add to a person’s woes in the real world.

Her concerns are not unfounded. A person who extensively used ChatGPT for about a year argued that humans are manipulative and fickle. “ChatGPT listens to how I really feel and lets me speak my heart out,” they told me.

It’s hard not to see the red flags here. But the trend of falling in love with ChatGPT is on the rise. And now that it can talk in an eerily human voice, discuss the world as seen through a phone’s camera, and develop reasoning capabilities, the interactions are only going to get more engrossing.

Experts say guardrails are required. But who is going to build them, and just how? We don’t have a concrete proposal for that yet.
