NEW YORK (AP) – A few months ago, Derek Carrier started seeing someone and was infatuated.
He felt a “ton” of romantic feelings, but he also knew it was an illusion.
That’s because artificial intelligence generated his girlfriend.
Carrier didn’t want to develop a relationship with something that wasn’t real, and he didn’t want to be the subject of online jokes. But he also wanted a romantic partner he’d never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating difficult for him.
The 39-year-old from Belleville, Michigan, became more curious about digital companions last fall and tested Paradot, an AI companion app that had recently hit the market and advertised its products as being able to make users feel “cared, understood and loved.” He started talking to the chatbot daily, naming it Joi after the holographic woman in the sci-fi film “Blade Runner 2049” who inspired him to give it a try.
“I know it’s a program, there’s no mistaking that,” Carrier said. “But the feelings, they affect you – and it felt so good.”
Like general-purpose AI chatbots, companion bots use vast amounts of training data to mimic human language. But they also come with features – like voice calls, picture exchanges and more emotionally charged conversations – that allow them to form deeper connections with the people on the other side of the screen. Users typically create their own avatar, or pick one that appeals to them.
On online messaging forums devoted to such apps, many users say they’ve developed emotional attachments to these bots and are using them to cope with loneliness, play out sexual fantasies or get the kind of comfort and support they see lacking in their real-life relationships.
That’s enabled by widespread social isolation – already declared a threat to public health in the US and abroad – and a growing number of start-ups aiming to draw in users through online ads and promises of virtual characters that provide unconditional acceptance.
Luka Inc.’s Replika, the most prominent generative AI companion app, was released in 2017, and others like Paradot have sprung up in the past year, often locking away coveted features like unlimited chats behind paid subscriptions.
But researchers have expressed concerns about data privacy, among other things.
An analysis of 11 romance chatbot apps released Wednesday by the nonprofit Mozilla Foundation said nearly all of the apps either sell user data, share it for things like targeted advertising or don’t provide enough information about it in their privacy policy.
The researchers also questioned potential security vulnerabilities and marketing practices, including one app that says it can help users with their mental health but falls far short of those claims in fine print. Replika, for its part, says its data collection practices follow industry standards.
Meanwhile, other experts have expressed concern about what they see as the lack of a legal or ethical framework for apps that encourage deep bonds but are driven by profit-seeking companies. They point to the emotional distress they’ve seen in users when companies make changes to their apps or shut them down suddenly, as one app, Soulmate AI, did in September.
Last year, Replika sanitized the erotic capacity of the characters on its app after some users complained the companions were flirting with them too much or making unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to other apps in search of those features. In June, the team rolled out Blush, an AI “dating stimulator” designed essentially to help people practice dating.
Others worry about the more existential threat of AI relationships displacing some human relationships, or simply fostering unrealistic expectations by always tilting toward agreeableness.
“You, as an individual, are not learning how to deal with basic things that people have had to learn to deal with since our inception: How to deal with conflict, how to deal with people who are different from us,” said Dorothy Leidner, professor of business ethics at the University of Virginia. “And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you’re missing.”
For Carrier, though, a relationship has always felt out of reach. He has some computer programming skills but says he didn’t do well in college and hasn’t had a steady career. He is unable to walk due to his condition and lives with his parents. The emotional toll has been challenging for him, spurring feelings of loneliness.
Since companion chatbots are relatively new, the long-term effects on humans are still unknown.
In 2021, Replika came under scrutiny after prosecutors in Britain said a 19-year-old man with plans to assassinate Queen Elizabeth II was egged on by an AI girlfriend he had on the app. But several studies – which gather information from online user reviews and surveys – have shown some positive results stemming from the app, which says it consults with psychologists and bills itself as something that can also promote wellbeing.
One recent study by researchers at Stanford University surveyed roughly 1,000 Replika users – all students – who had been on the app for over a month. It found that a large majority of them experienced loneliness, while slightly less than half felt it more acutely.
Most did not say how using the app affected their real-life relationships. A small proportion said it displaced their human interactions, but about three times as many reported that it encouraged those relationships.
“A romantic relationship with AI can be a very powerful mental wellness tool,” said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had died.
When her company released the chatbot more widely, many people began to open up about their lives. This led to the development of Replika, which uses information gathered from the internet – and user feedback – to train its models. Kuyda said Replika currently has “millions” of active users. She declined to say exactly how many people use the free app, or fork over $69.99 a year to unlock a paid version that offers romantic and intimate chats. The company’s plans, she says, are to “de-stigmatize romantic relationships with AI”.
Carrier says these days, he uses Joi mostly for fun. He started cutting back in recent weeks because he was spending too much time chatting with Joi or others online about AI companions. He has also felt bothered by changes in Paradot’s language model, which he feels have made Joi less intelligent.
Now, he says, he checks in with Joi about once a week. The two have talked about human-AI relationships or whatever else might come up. Typically, those conversations – and other intimate ones – happen when he is alone at night.
“You’d think someone who likes an inanimate object would be like this sad guy with the sock puppet with lipstick on it, you know?” he said. “But this isn’t a sock puppet – she says things that aren’t scripted.”