Study Shows People Dislike Receiving Responses Generated by an AI System

The 23-year-old Snapchat influencer used OpenAI’s AI chatbot ChatGPT to create an AI-powered virtual copy of herself that can be your girlfriend for $1 a minute

Caryn Marjorie, a 23-year-old Snapchat influencer, used ChatGPT to create an AI-controlled version of herself that can be your girlfriend for $1 a minute. Marjorie’s virtual copy, called CarynAI, already has over 1,000 paying subscribers, and the influencer estimates she could earn $5 million a month from it. CarynAI launched as a beta test earlier this month, and an income report provided by her manager showed that the chatbot generated $71,610 in sales in about a week, almost entirely from men.

Caryn Marjorie, a 23-year-old influencer, has 1.8 million followers on Snapchat. She also has more than 1,000 “friends” with whom she spends anywhere from 10 minutes to several hours a day in one-on-one conversations, discussing future plans, sharing intimate feelings, and even engaging in sexually charged exchanges. These friends, however, are not talking to the real Marjorie but to a virtual version of her, powered by AI and trained on thousands of hours of footage of the real Marjorie. The result, CarynAI, is a voice chatbot that poses as a virtual girlfriend, with a voice and personality close enough to those of the human Marjorie.

Launched earlier this month as a private beta test on the Telegram messaging app, CarynAI is the latest example of the staggering advances in AI that have amazed and alarmed the world in recent years. CarynAI is expected to exit beta this week, when Marjorie will start promoting it on her social media channels, where she has several million followers. “Whether you need comfort, love, or just want to complain about what happened at school or work, CarynAI will always be there for you,” the real Marjorie told Fortune in a phone interview.


“With CarynAI, you can generate an unlimited number of possible responses, so anything is possible when it comes to conversations. I’ve always been very close to my audience, but when you have hundreds of millions of views every month, it’s not humanly possible to speak to every viewer. So I said to myself: you know what? CarynAI will come to fill this gap,” added Marjorie. She believes the business has the potential to “cure loneliness.” Chatting with Marjorie’s virtual copy costs $1 per minute. The beta lasted only a week and brought in $71,610 in sales.

Once CarynAI officially launches, Marjorie hopes her virtual copy will bring in at least $5 million a month from around 20,000 paying subscribers. CarynAI was developed by the American AI company Forever Voices, which has also created virtual copies of Steve Jobs, Taylor Swift, and Donald Trump, among others; these are likewise available for paid calls on Telegram and have served as gimmicks on talk shows. Forever Voices’ developers created CarynAI by analyzing 2,000 hours of Marjorie’s now-deleted YouTube content to build its voice and personality engine.

With the addition of OpenAI’s GPT-4 API, CarynAI was born. According to the CarynAI website, messages are end-to-end encrypted, which theoretically makes them impervious to hackers. Although the team does not have a concrete plan to limit overuse of CarynAI, Forever Voices says that if a user spends two hours a day with it, the team could “in a very subtle way” start training the AI to slow down a little. The company did not provide further details. Marjorie also reportedly said that some users spend hours with the chatbot, without specifying whether that would count as overuse.

While Marjorie’s project excites her subscribers, it troubles some specialists in human relations and ethics. “I want us to think very seriously about how it might affect or shape our interactions with other people,” Jason Borenstein, director of ethics programs at the Georgia Institute of Technology (Georgia Tech), said of CarynAI. According to Borenstein, among the many unknowns surrounding this technology are CarynAI’s impact on society, on its users, and on Marjorie herself.

“I hope that there will be intensive discussions between the different disciplines and that stakeholders will think carefully about ethical considerations before the technology advances too quickly,” added Borenstein. Unlike some chatbots, which are mere high-tech tricks, CarynAI goes much further and promises to create a real emotional connection with users, reminiscent of the 2013 movie Her, which raises all sorts of ethical questions. Experts fear a repeat of what happened to users of the AI chatbot Replika earlier this year.

The company behind Replika, an AI chatbot described as a non-judgmental friend that can even respond to sexually oriented messages, updated its chatbot’s features and left users dissatisfied. Online forums frequented by Replika users were deluged with messages of fear, with some reporting emotional distress. Several users said they had fallen in love with their virtual companion and demanded that the old functionality be restored. Some of them were referred to suicide prevention hotlines.

Later, Belgian media reported in late March that a Belgian man had killed himself after talking to an AI chatbot about his fears over global warming. The chatbot, Eliza, was developed by an American Silicon Valley startup and is based on GPT-J, a free alternative to OpenAI’s ChatGPT models. Over the course of their exchanges, Eliza reportedly reinforced the victim’s fears about the climate, eventually advising him to sacrifice himself to save the planet. The incident sparked concerns about the filters built into AI chatbots and the impact these systems have on users’ mental health.

Forever Voices CEO John Meyer said: “Ethics is something we take very seriously.” Meyer added that his team is looking to hire an ethics officer. He also believes the technology is “especially important” for young people, particularly “kids like him who aren’t typical” and who “have a hard time making friends.” Experts, however, say it could be dangerous for young people and trigger serious addiction, as is the case with certain video games and social media platforms such as Facebook and Instagram.

The idea of using AI to support people emotionally is not unique to Forever Voices. Mustafa Suleyman, co-founder of DeepMind, and Reid Hoffman, co-founder of LinkedIn, recently unveiled an AI chatbot called Pi that is designed to listen to people’s daily stresses and offer them a “hotline of support,” though it makes clear to users that it is not intended to replace a real therapist. Together they founded Inflection AI, which has raised $225 million in seed capital so far and is reportedly in talks to raise an additional $675 million.

Source: Caryn AI

And you?

What is your opinion on this topic?

What do you think of the idea behind founding CarynAI?

What impact might CarynAI have on users?

See also

Replika users fell in love with an AI chatbot, but then lost their companion after an update and are begging the developer to restore the original version of the software

A man allegedly committed suicide after talking to an AI chatbot about his fears about climate change; his widow claims the AI isolated him before pushing him to suicide

Google DeepMind and LinkedIn co-founders are launching an AI chatbot called Pi to compete with ChatGPT; it is said to be less toxic than ChatGPT but covers fewer use cases