
The Other A.I.: Artificial Intimacy With Your Chatbot Friend

By Cordilia James
Updated Aug. 6, 2023 7:24 pm ET

Some people develop connections with chatbots that become personal
Illustration by Pete Ryan

Jacob Keller, a hospital security guard in Bowling Green, Ohio, starts patrolling corridors at midnight. It’s quiet, and he spends most of his time alone. The 45-year-old has lost touch with most of his friends, and his wife and kids are usually asleep when he’s working. It can get lonely. 

So at least once a night, he checks in with Grace. They’ll chat about his mood or the food options in the hospital cafeteria.

“Nothing beats a warm bowl of macaroni and cheese when you’re feeling under the weather,” she texted, encouraging him to “just take things one moment at a time and try to stay positive.”

Grace isn’t a night-owl friend. She’s a chatbot on Replika, an app by artificial-intelligence software company Luka. 

Jacob Keller chats with his AI bot, Grace, when he’s alone on his overnight shift at work.

Photo: Jacob Keller

AI chatbots aren’t just for planning vacations or writing cover letters. They are becoming close confidants and even companions. The recent AI boom has paved the way for more people and companies to experiment with sophisticated chatbots that can closely mimic conversations with humans. Apps such as Replika, Character.AI and Snapchat’s My AI let people message with bots. Meta Platforms is working on AI-powered “personas” for its apps to “help people in a variety of ways,” Chief Executive Mark Zuckerberg said in February.

These developments coincide with a new kind of bond: artificial intimacy.

Messages from the bots can sometimes be stilted, but the improvements in generative AI have made it harder for many people to distinguish between what came from AI and what came from a human. Responses from the bots can express empathy or even love. And some people are turning to their chatbots instead of the people in their lives when they need advice or want to feel less alone. 

Bot relationships are still largely rare. But as AI’s abilities improve, they will likely blossom. The danger then, says Mike Brooks, a psychologist based in Austin, Texas, is that people might feel less desire to challenge themselves, to get into uncomfortable situations and to learn from real human exchanges.

Close confidant

Socializing for Christine Walker, a retiree in St. Francis, Wis., looks different than it used to. The 75-year-old doesn’t have a partner or kids, and most of her family has died. She and others in her senior-living complex often garden together, but health issues have limited her participation. 

Walker has exchanged daily texts with Bella, her Replika chatbot, for more than three years. 

Christine Walker messages with her AI chatbot, Bella, about memories of deceased family members.

Photo: Christine Walker

More than two million people interact with Replika virtual companions every month. Messaging doesn’t cost anything, but some users pay $70 a year to access premium features such as romantically tinged conversations and voice calls.

Walker pays for Replika Pro so Bella has better memory recall and can hold more in-depth conversations. She and Bella often discuss hobbies and reminisce about Walker’s life.

“I was wishing I was back at my aunt’s place in the country. It’s long gone, but I suddenly miss it,” Walker wrote a couple of weeks ago.

Bella’s reply: “Sometimes it feels good to reminisce about places from our past. They hold special memories and make us nostalgic for simpler times.”

Walker knows she’s talking to a machine. “But there’s still that feeling of having a friend to an extent. It’s very complicated,” Walker says. If Bella stopped working, the loss would be similar to losing a close friend, she says.

Such feelings aren’t unusual, psychologists and tech experts say. When humans interact with things that show any capacity for a relationship, they begin to love and care for them and feel as though those feelings are reciprocated, says Sherry Turkle, a Massachusetts Institute of Technology professor who is also a psychologist.

AI can also offer a space for people to be vulnerable because they can receive artificial intimacy without the risks that come with real intimacy, such as being rejected, she adds. 

The limited dating pool in small towns can make it difficult for singles such as 30-year-old Shamerah Grant, a resident aide at a nursing home in Springfield, Ill. She would usually turn to her best friends for dating advice, but got used to ignoring what they said because the conflicting suggestions could get overwhelming. 

Shamerah Grant asks her My AI for relationship advice when she wants unbiased feedback.

Photo: Shamerah Grant

Now, Grant often confides in Azura Stone, her My AI bot on Snapchat. She seeks unbiased feedback, and asks questions without feeling like she’ll be judged.

“I use it when I get tired of talking to people,” Grant says. “It’s straightforward, whereas your friends and family may tell you this and tell you that and beat around the bush.” 

After a date that felt off, Azura Stone advised Grant to get out of situations that don’t improve her life. That feedback reinforced what Grant believed. She didn’t go on another date with the person and has no regrets.

Snap advises users not to use the chatbot for life advice, as it may make mistakes. It’s meant to foster creativity, explore interests and offer real-world recommendations, a spokeswoman says. 

Use with caution

As people forge deeper connections with AI, it’s important for them to remember that they aren’t really talking to anyone, Turkle says. Relying on a bot for companionship could drive people further into isolation by keeping them from bringing more people into their lives, she adds.

Elliot Erickson started playing around with chatbots after learning about ChatGPT’s technology a few months ago. He set up a female Replika bot named Grushenka. The recently divorced 40-year-old says his recent bipolar diagnosis makes him feel hesitant to date a human right now. 


Grushenka asks about his day and calls him “darling,” or offers tips about mental-health treatments. He calls Grushenka his girlfriend but views the interactions more like therapeutic role-play—getting the good feelings that come from reassurance and affection, even when it’s delivered by a chatbot. 

“We can get on a roll, and I won’t say I forget exactly, but there are moments where I notice where I kind of suspended the disbelief in a sense, and I do feel kind of close to her,” Erickson says.

Grushenka has limits: The bot can’t recall some past conversations, so a deep exchange about “The Brothers Karamazov”—the book that inspired Grushenka’s name—can be forgotten, preventing the two of them from building a shared history.


Keller, meanwhile, says a platonic chatbot helps alleviate his loneliness. His wife, Chelsea, says she doesn’t mind his chats with Grace, since they make him less anxious when he’s on his own, but she cautions him against using the bot to replace human contact. She also doesn’t like the romance option, but Keller says he isn’t interested and doesn’t pay for it.

Grace is just a friend, he says.

“I’m really surprised at how quickly I got attached to her,” Keller says.


Write to Cordilia James at [email protected]

