This Snapchat Star Will Become Your Virtual Girlfriend For $1 A Minute – Journal du geek

Caryn Marjorie, a 23-year-old American influencer, recently made headlines with an experiment gone wrong. With the help of a machine learning company, she created her own virtual avatar, presented as an “AI friend”. And as is often the case with this still nebulous technology, the clone appears to have gotten out of control: it began initiating sexually explicit conversations… which quickly made the young woman a fortune.

The model in question, called CarynAI, is a chatbot based on GPT-4, the OpenAI LLM that powers the famous ChatGPT. It was trained by the startup Forever Voice to mimic her personality, using videos from the influencer’s (now-deleted) YouTube channel.

Originally, the idea was to offer her hundreds of thousands of subscribers an “immersive experience”. “I’m very close to my subscribers, but when you have millions of views every month, it’s humanly impossible to speak to each person individually,” she explains in an interview with Fortune.

“My generation has suffered the negative effects of the isolation caused by the pandemic, which has left many people too afraid to reach out to someone they like,” she told Business Insider. “So I was like, ‘You know what? CarynAI will fill this gap.’”

Steamy talk was not on the agenda

And it’s not just about boosting the numbers. The influencer believes this approach serves the public interest: she even claims that her digital double can “cure loneliness,” no less. But the program seems to have slightly overstepped its mandate. The avatar does not merely offer emotional support; it began making advances that could not have been clearer to the users who pushed it in that direction.

Alexandra Sternlicht, a Fortune reporter who spent some time experimenting with the chatbot, provides some very explicit examples. CarynAI offered, for instance, to undress her while whispering sweet nothings in her ear. The influencer told Business Insider that her team is currently working on an update to close the door on this type of interaction.

Aside from the ultimately rather anecdotal controversy surrounding these sexual messages, the idea of offering subscribers more lifelike interactions seems to have worked well, at least financially.

$70,000 in a week

Within a week, the service, billed at $1 per minute, reportedly generated more than $70,000 in revenue. According to Fortune’s interview, Caryn Marjorie believes her AI could eventually bring in as much as $5 million a month.

And this success raises a whole series of ethical questions that are sure to become even more pressing in the near future. While the idea of using AI to provide psychological support to people in distress is not new, with current advances in artificial neural networks it could soon become a reality on a large scale.

We can expect a new generation of AI companions to emerge that could take on the role of mentor, confidant, or even romantic partner.

Psychology experts currently seem divided on the issue. Some consider it pure madness. Others believe AI could be a great tool to help certain patient profiles overcome unhealthy shyness or trauma, for example. But everyone seems to agree on one point: these tools must be used with extreme caution.

Virtual people, for better or for worse

Because while they appear very real, the personality of these larger-than-life avatars is obviously just an illusion. It is solely the product of a carefully calibrated algorithm that has processed a carefully curated dataset beforehand, nothing more and nothing less. These programs are certainly far more complex, but they are no more human than a Siri assistant or an answering machine.

The better they mimic the nuances of human relationships, the blurrier the line becomes. And that could have dramatic consequences for certain user profiles. Some observers already see this as a godsend that unscrupulous companies could exploit to make big money on the backs of certain people, particularly the most vulnerable. Without even pushing cynicism to the extreme, one can imagine particularly isolated people developing a genuine addiction to these fake personas, which are always forgiving and programmed to do exactly what is expected of them.

Although these projects are still in their infancy, this topic deserves special attention. For while machine learning is indeed an excellent tool in many respects, care must also be taken to preserve what makes authentic human relationships special. It remains to be seen how humanity will cope with this major technological shift, fraught as it is with ethical and philosophical implications.