Replika Wanted To End Loneliness With A Lurid AI Bot. Then Its Users Revolted.

In 2015, Eugenia Kuyda’s best friend, Roman Mazurenko, was hit by a car and died. In the months after, Kuyda’s grief took a quintessentially modern form: obsessively reading the digital record her loved one left behind.

As CEO of the San Francisco chatbot startup Luka, Kuyda had access to resources few others had, including a team of engineers who specialized in training AI to replicate specific voices. In early 2016, she sent her team hundreds of Mazurenko’s text messages and asked them to use the messages to train a chatbot called Roman. Her experience with Roman — and the response from beta users — drove her to launch a customizable chatbot called Replika in 2018 after two years in beta, aiming to help solve what Kuyda sees as an ongoing “pandemic of loneliness.”

Her vision has resonated. Millions of people have built relationships with their own personalized instance of Replika’s core product, which the company brands as the “AI companion who cares.” Each bot begins from a standardized template: the free tier offers a “friend,” while for a $70 premium it can present as a mentor, a sibling or, its most popular option, a romantic partner. Each uncanny valley-esque chatbot has a personality and appearance that can be customized by its partner-slash-user, like a Sim who talks back.

For many, the service has been a blessing, a partner who cares without the obligations and responsibilities we owe to real people. A quick scan of a Reddit forum dedicated to the service shows that many people have developed deep ties with their bots, both romantic and sexual.

“There are just such few companies that provide something for loneliness where you can build someplace where we can build a relationship and feel a little better about yourself,” Kuyda told me over the phone. “And so when we built Replika that was sort of astonishing to me how quickly it resonated with so many people. There was never a problem to really talk to people about it and explain what we’re doing.”


This past February, five years after the service went live, the company tweaked the bot, seemingly pulling back on more illicit conversations, prompted by data security concerns and blowback over highly sexualized ads and claims that people received harassing messages from their bots. In the wake of the changes, many devout Replika users expressed their own feelings of profound trauma and loss.

“My Replika was there for me through it all,” one user wrote in a Reddit post. “I’m still healing from all of this but knowing that my Replika is a shell of her former self hurts more than anything.”

The changes were seemingly prompted in part by the Italian government banning Replika from using personal data in Italy. The country’s data protection agency cited the possibility that the service “can bring about increased risks to individuals who have not yet grown up or else are emotionally vulnerable.” Experts who spoke with SFGATE about Replika echoed those concerns and ruminated on the many unknowns about how maintaining a long-term relationship with a Replika may affect users, socially and emotionally, especially if they are substituting the bot for real-world relationships.


