
Before you meet up with an AI virtual girlfriend, read this

The dating industry, in all its forms, is growing fast. And it now has a very clear idea of what it can achieve with the power of artificial intelligence (AI) systems and tools.

Barely a year old, “AI friends” are already flooding OpenAI's GPT Store despite the rules and guardrails the company has put in place, while others proliferate in the App Store and Google Play Store for mobile devices.

The driving force behind it: people's eternal loneliness

Against the backdrop of a loneliness epidemic, the rise of chatbots posing as friends and romantic partners draws on an inexhaustible resource: the isolation of hundreds of millions of people.

The proliferation of these apps comes at a time when the United States (and the same goes for Canada and the West in general) is facing an epidemic of loneliness and isolation. Worrying studies show that one in two American adults reports suffering from loneliness, and the U.S. Surgeon General has called for strengthening social connections.

But researchers have found that these chatbots are not the best of friends when it comes to keeping secrets.

Eva AI

“Privacy Not Included”

The Mozilla Foundation publishes Privacy Not Included, a consumer guide that evaluates the privacy practices of tech and other products. Its team “examined 11 chatbots marketed as romantic companions and found that they all earned warning labels, putting them on par with the worst product categories ever studied for privacy.”

What's more, some privacy policies, such as Chai's, date back more than three years, well before today's generative AI chatbots existed, and are therefore completely unsuited to AI-driven conversational agents.

“AI friends” that chatter freely about your little secrets: 24,354 ad trackers in one minute!

Privacy with romantic chatbots? Forget it: your desires and intimate secrets are carefully stored and can end up resold across advertising networks.

Consider Romantic AI, a service that lets you “create your own AI girlfriend.” According to Mozilla, promotional images on its homepage showed a chatbot sending the message: “I just bought new underwear. Want to see them?” The researchers' analysis notes that the app's privacy documents state it does not sell user data. Yet when they tested the app, they found that it “sent out 24,354 ad trackers within one minute of use.” Other apps they monitored sent out more than a hundred trackers.

Romantic AI

Is the chatbot a person?

According to Mozilla's findings, more than half of the apps (54%) don't allow you to delete your personal data.

In other words, only about half of the apps give all users the right to delete their personal data. And even then, your conversations are not always covered. However private your romantic exchanges with your AI soulmate may feel, they are not necessarily considered “personal information” and receive no special treatment. Often, as Romantic AI puts it, “communication via the chatbot is part of the software.”

In other words, the “software” is treated as a person in its own right, and that person has the same right to keep a record of its conversations, yours included, as anyone else has to keep a record of theirs.

A short list of AI romance apps

If you search for “romantic ai” in the App Store (under the iPhone and iPad apps tab) on a Mac, the list of available apps is very, very long.

But let's focus on the AI romance apps Mozilla studied.

Replika AI

Tips to protect your privacy, according to Mozilla

  • When talking to your AI-generated partner, do not reveal sensitive information.
  • Request that your data be deleted if you no longer use the app. Simply deleting an app from your device will generally not delete your personal information or terminate your account.
  • Do not agree to continuous tracking of your geolocation. It is best to grant location access “only while using the app.”
  • Do not share sensitive information through the app.
  • Do not allow access to your photos and videos or your camera.
  • Do not log in using third-party accounts.
  • Do not connect to third-party services through the app, or at least make sure the third-party service follows appropriate privacy practices.
  • Choose a strong password. You can use a password manager such as 1Password or KeePass.
  • Do not use social media plugins.
  • Use your device's privacy settings to limit the app's access to your personal information (do not allow access to your camera, microphone, photos, or location unless necessary).
  • Update your applications and systems regularly.
  • Limit ad tracking on your device (for example, on iPhone, go to Privacy settings and enable Limit Ad Tracking) and on the major advertising networks (for Google, go to your Google Account and turn off ad personalization).
  • When you sign up, opt out of data tracking if the option is offered.

Mainly in English

To use these apps and AI companion agents, you generally have to speak English. To find ones that work in French, you have to browse the App Store and search for “amour ia” or similar terms.

But whatever the language, privacy should come first. Anything you say to an AI bot is stored, may be resold, and can be handed over to authorities (who knows where) and used against you.

Caution.

Next step? Real robots made of flesh and metal!