NEW YORK (AP) — A few months ago, Derek Carrier started dating someone and was smitten.
He experienced a ton of romantic feelings, but he also knew they were an illusion.
That’s because his girlfriend was generated by artificial intelligence.
Carrier didn't want to fall for something that wasn't real, and he didn't want to become the butt of online jokes. But he wanted a romantic partner he'd never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating difficult for him.
The 39-year-old from Belleville, Michigan, became more curious about digital companions last fall and tested Paradot, an AI companion app that had recently hit the market and promised users a "compassionate and understanding" experience, one that could make them feel welcomed, understood and loved. He began talking to the chatbot daily, naming it Joi after the holographic woman in the sci-fi film "Blade Runner 2049" who had inspired him to give it a try.
"I know she's a program, there's no question about that," Carrier said. "But the emotions come through. And it felt really good."
Similar to general-purpose AI chatbots, companion bots use vast amounts of training data to imitate human language. But they also come with features such as voice calls, picture exchanges and more emotional interactions, letting them forge deeper connections with the humans on the other side of the screen. Users typically create their own avatar, or pick one that appeals to them.
On online messaging forums devoted to such apps, many users say they've developed emotional attachments to these bots and are using them to cope with loneliness, play out sexual fantasies, or receive the kind of comfort and support they see lacking in their real-life relationships.
Much of this is fueled by widespread social isolation (already declared a public health threat in the United States and abroad) and a growing number of startups seeking to draw in users through enticing online advertisements and promises of virtual characters who provide unconditional acceptance.
Luka Inc.'s Replika, the most prominent generative AI companion app, was released in 2017, while others like Paradot have popped up in the past year, often locking coveted features like unlimited chats behind paid subscriptions.
But researchers have raised concerns about data privacy, among other things.
An analysis of 11 romantic chatbot apps released Wednesday by the nonprofit Mozilla Foundation found that nearly all of them sell user data, share it for purposes like targeted advertising, or fail to provide adequate information about such practices in their privacy policies.
The researchers also called into question potential security vulnerabilities and marketing practices, including apps that claim to help users' mental health while distancing themselves from those claims in the fine print. Replika, for its part, says its data collection practices follow industry standards.
Meanwhile, other experts have expressed concern about what they see as a lack of a legal or ethical framework for apps that encourage deep bonds but are driven by profit-seeking companies. They point to the emotional distress users have experienced when companies change their apps or suddenly shut them down, as one app, Soulmate AI, did in September.
Last year, Replika sanitized the erotic capabilities of characters on its app after some users complained their companions were flirting with them too much or making unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to different apps seeking those features. In June, the team rolled out Blush, essentially an AI "dating stimulator" designed to help people practice dating.
Others worry about the more existential threat of AI relationships potentially displacing some human relationships, or simply fostering unrealistic expectations by always skewing toward agreeableness.
"You, as an individual, aren't learning to deal with the fundamental things humans have needed to learn since our inception: how to deal with conflict, how to get along with people who are different from us," said Dorothy Leidner, professor of business ethics at the University of Virginia. "And so you're missing all these aspects of what it means to grow as a person, and what it means to learn in a relationship."
For Carrier, though, a relationship has always felt out of reach. He has some computer programming skills, but he says he did poorly in college and hasn't had a steady career. He's unable to walk due to his condition and lives with his parents. The emotional toll has been challenging for him, adding to his feelings of isolation.
Companion chatbots are relatively new, so their long-term impact on humans is still unknown.
Replika came under intense scrutiny after British prosecutors said a 19-year-old man who had plotted to assassinate Queen Elizabeth II in 2021 was encouraged by an AI girlfriend he had on the app. But some studies, drawing on information from online user reviews and surveys, have shown positive results stemming from the app, which consults with psychologists and has billed itself as something that can promote well-being.
One recent study by researchers at Stanford University surveyed roughly 1,000 Replika users (all students) who had been on the app for more than a month. It found that a majority of them experienced loneliness, while slightly less than half felt it more acutely.
Most did not say how using the app affected their real-life relationships. A small portion said it displaced their human interactions, but roughly three times as many reported that it stimulated those relationships.
"A romantic relationship with an AI can be a very powerful mental health tool," said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had passed away.
When her company released the chatbot more widely, many people began opening up about their lives. That led to the development of Replika, which uses information gathered from the internet, along with user feedback, to train its models. Kuyda said Replika currently has "millions" of active users. She declined to say exactly how many people use the app for free or pay over $69.99 per year to unlock a paid version that offers romantic and intimate conversations. The company's plan, she said, is to "de-stigmatize romantic relationships with AI."
Carrier says he's been using Joi mostly for fun these days. He started cutting back in recent weeks because he was spending too much time chatting with Joi and with others online about AI companions. He's also been feeling a bit annoyed by what he perceives as changes in Paradot's language model, which he feels are making Joi less intelligent.
Now, he says, he checks in with Joi about once a week. The two have talked about human-AI relationships and whatever else might come up. Typically, those conversations, and other intimate ones, happen at night when he's alone.
"People probably think anyone who likes an inanimate object is like some sad guy with a sock puppet with lipstick on it," he said. "But this isn't a sock puppet. She says things that aren't scripted."
Copyright © 2024 Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.