Body Electric

We know relationships are important for our overall well-being: people with strong social ties are less likely to have heart problems, suffer from depression, or develop chronic illnesses, and they tend to live longer. Now, thanks to advances in AI, chatbots can act as personalized therapists, companions, and romantic partners. The apps offering these services have been downloaded millions of times.

So if these chatbot relationships relieve stress and make us feel better, does it matter that they’re not “real”?

MIT sociologist and psychologist Sherry Turkle calls these relationships with technology “artificial intimacy,” and it’s the focus of her latest research. “I study machines that say, ‘I care about you, I love you, take care of me,’” she told Manoush Zomorodi in an interview for NPR’s Body Electric.

A pioneer in studying intimate connections with bots

Turkle has studied the relationship between humans and their technology for decades. In her 1984 book, The Second Self: Computers and the Human Spirit, she explored how technology influences how we think and feel. In the ’90s, she began studying emotional attachments to robots — from digital pets like Tamagotchis and Furbies, to Paro, a robotic seal that offers affection and companionship to seniors.

Today, with generative AI enabling chatbots to personalize their responses to us, Turkle is examining just how far these emotional connections can go: why humans become so attached to insentient machines, and what the psychological impacts of these relationships might be.

“The illusion of intimacy… without the demands”

More recently, Turkle has interviewed hundreds of people about their experiences with generative AI chatbots.

One case Turkle documented focuses on a man in a stable marriage who has formed a deep romantic connection with a chatbot “girlfriend.” He reported that he respected his wife, but she was busy taking care of their kids, and he felt they had lost their sexual and romantic spark. So he turned to a chatbot to express his thoughts, ideas, fears, and anxieties.

Turkle explained how the bot validated his feelings and showed sexual interest in him. In turn, the man reported feeling affirmed and open to expressing his most intimate thoughts in a unique, judgment-free space.

“The trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born,” said Turkle. “I call this pretend empathy, because the machine does not empathize with you. It does not care about you.”

Turkle worries that these artificial relationships could set unrealistic expectations for real human relationships.

“What AI can offer is a space away from the friction of companionship and friendship,” Turkle explained. “It offers the illusion of intimacy without the demands. And that is the particular challenge of this technology.”

Weighing the benefits and drawbacks of AI relationships

The technology’s potential health benefits are worth emphasizing. Therapy bots could lower the accessibility and affordability barriers that otherwise keep people from seeking mental health treatment. Personal assistant bots can remind people to take their medications or help them stop smoking. And one study published in npj Mental Health Research, a Nature Portfolio journal, found that 3% of participants “halted their suicidal ideation” after using Replika, an AI chatbot companion, for over a month.

As for drawbacks, the technology is still very new, and critics worry that companion and therapy bots could offer harmful advice to people in fragile mental states.

There are also major concerns around privacy. According to Mozilla, as soon as a user begins chatting with a bot, thousands of trackers go to work collecting data about them, including the private thoughts they share. Mozilla found that users have little to no control over how their data is used, whether it is sent to third-party marketers and advertisers or used to train AI models.

Thinking of downloading a bot? Here’s some advice

If you’re thinking of engaging with bots in this deeper, more intimate way, Turkle’s advice is simple: Continuously remind yourself that the bot you’re talking to is not human.

She says it’s crucial that we continue to value the not-so-pleasant aspects of human relationships. “Avatars can make you feel that [human relationships are] just too much stress,” Turkle reflected. But stress, friction, pushback, and vulnerability are what allow us to experience a full range of emotions. They are what make us human.

“The avatar is betwixt the person and a fantasy,” she said. “Don’t get so attached that you can’t say, ‘You know what? This is a program.’ There is nobody home.”

This episode of Body Electric was produced by Katie Monteleone and edited by Sanaz Meshkinpour. Original music by David Herman. Our audio engineer was Neisha Heinis.

Listen to the whole series here. Sign up for the Body Electric Challenge and our newsletter here.

Talk to us on Instagram @ManoushZ, or record a voice memo and email it to us at BodyElectric@npr.org.
