An intelligent system that understands natural language almost perfectly, draws on huge amounts of data and has an answer to practically every question. ChatGPT is so good, especially in its latest version, GPT-4, that it took only a few months for alarm bells to ring: how far can its influence go? Moral debates aside, the other major concern ChatGPT raises is its opaque handling of the information it processes.
“We don’t know what they’re really doing with the data,” warns digital law expert Borja Adsuara, “and not knowing what they’re doing with the data is a risk in itself.” Federico Vadillo, an Akamai security expert, shares the same opinion: “The use of ChatGPT poses risks in terms of data protection and legal compliance, especially with regard to the General Data Protection Regulation, the GDPR.” He goes further and warns that this “unauthorized” personal data “could be transferred outside the European Union”.
“Systems based on artificial intelligence (AI) tend to behave like a black box: we know what happens at the end of the process, but we don’t know how the software learns and makes decisions,” warns Adrián Moreno, a cybersecurity expert. “This phenomenon presents challenges of understanding, control and ethics. Therefore, it makes sense to create a regulatory framework to address these challenges and enable responsible and safe development and use of AI,” he adds.
Moreno recalls that “Article 15 of the GDPR establishes that ‘the data subject has the right to obtain confirmation from the data controller as to whether personal data concerning him are being processed’”, and in this sense the way AI-based systems use information is “a cause for concern”.
Given this reality, how can all personal data be deleted? The bad news is that there is no automated or instant way to erase the stored information: the conversations held with the system are recorded along with the user’s name and registration details, as stated in the company’s privacy policy. Since there is no immediate or automated way to access and delete this information, OpenAI provides a form that the interested party must fill out to request its removal. And here comes the worrying part: the user is forced to accept a clause in which OpenAI warns that it may not be able to remove the information.
The request requires filling out a complex form that proves irrefutably (even with screenshots) that the system provides information about the person concerned, along with the reasons for requesting removal. The document must be completed with the real data of the interested party, who must also “swear” in writing that the information is correct. And to make matters worse, the form warns the user that, where possible, the information entered may be checked against other sources to verify its accuracy.
What is done with this information and how is it used? OpenAI claims that it does not use personal data for any purpose other than improving its systems, but warns in its privacy policy that the data “may be shared with third parties without notice to the user.” In short, OpenAI offers a product with high added value, but at an even higher cost for the user: not knowing how their data is processed. It should also be remembered that the information viewed in ChatGPT makes it easy to build a user profile, and exploiting it is very tempting. Faced with this dilemma, anyone who does not want to take risks would be wise to make conscious use of the content they consult or, more radically, to wait to use the tool until it complies with regulations such as the GDPR: “It’s best not to give out any personal data, just in case,” advises Adsuara.