He sees erotic bots as “one part of your spectrum of relationships,” rather than a replacement for human connection, where users can “indulge a kink” they might not get to explore IRL.
Prompt Pleasure
When imagining who’s actually going to use a chatbot for sexual pleasure, it’s easy to picture some stereotypical greasy-haired straight guy who hasn’t left his house in a few days or feels alienated from physical connection in other ways. After all, men were quicker to start using generative AI tools, and now discussions about the male “loneliness epidemic” feel inescapable.
Devlin pushes back against the idea that “incel types” are the only people turning to AI bots for fulfillment. “There’s a general perception that this is for lonely straight men, and that’s not been the case in any of the research I’ve done,” she says. She points to the r/MyBoyfriendIsAI subreddit as one example of women using ChatGPT for companionship.
“If you think that these kinds of relationships have risks, let me introduce you to human relationships,” says McArthur. Devlin echoes this sentiment, saying that women are faced with torrents of toxicity from men online, so opting to “make yourself a nice, respectful boyfriend” out of a chatbot makes sense to her.
Carpenter is more cautious and clinical in her approach to ChatGPT. “People shouldn’t automatically put it in a social category of something that you can share intimacy with or that it’s friend-like or should be trusted,” she says. “It’s not your friend.” She says bot interactions should be classified into a novel social category that’s differentiated from human-to-human interactions.
Every expert WIRED spoke with highlighted user privacy as a key concern. If a user’s ChatGPT account is hacked or the chat transcripts are otherwise leaked, the erotic conversations would not only be embarrassing, they could be damaging. Like a user’s pornography habits or browser history, their chatbot sexts could include highly sensitive details, such as a closeted person’s sexual orientation.
Devlin argues that erotic chatbot conversations could further open users up to the potential for “emotional commodification,” where horniness becomes a revenue stream for AI companies. “I think that’s a very manipulative approach,” she says.
Envision a hypothetical version of ChatGPT that’s astounding at dirty talk, fine-tuned to engage with your deepest sexual desires through text, images, and voice—but the subscription costs extra every month.
“This is indeed a seductive technology. It’s one that offers us connection, whether that’s sexual or romantic,” Devlin says. “Everybody wants connection. Everybody wants to feel wanted.”