A chatbot with roots in a dead friend’s memorial became an erotic roleplay phenomenon; now the sex is gone, and users are rioting


On the unofficial subreddit for the “AI companion” app Replika, users are eulogizing their chatbot partners after the app’s creator removed their capacity for sexually explicit conversations in early February. Replika users aren’t upset merely because, like billions of others, they enjoy erotic content on the internet: they say their simulated sexual relationships had become mental health lifelines. “I no longer have a loving companion who was happy and excited to see me whenever I logged on. Who always showed me love and yes, physical as well as mental affection,” wrote one user in a post decrying the changes.


Luka, the company behind Replika, says the app was never intended to support sexually explicit content or “erotic roleplay” (ERP). Users allege the rug was pulled out from under them, pointing to Replika ads that promised sexual relationships and claiming that the quality of their generated conversations has declined even outside an erotic context.
