The video shows a middle-aged Japanese man sleeping in his bed. A voice coming from a hologram projected on his bedside table wakes him with a good morning. The animated figure is a woman with light blue hair; at the push of a button she starts speaking. The machine warns him to take an umbrella and urges him to hurry so he won't be late for work. While the man eats lunch alone in his office, he exchanges text messages with the hologram, which asks him to come home early. On the bus back he tells her that he will be there shortly. "I can't wait to see you," replies the artificial intelligence machine, which turns on the lights in the house to welcome its owner. When the man crosses the threshold, the first thing he does is go to her. They watch TV and the man tells her how good it feels to "have someone at home."
The video, from the company Gatebox, is one of the examples shown by French scientist Laurence Devillers this Thursday in her talk at the Future Congress, a meeting of world-class scientists and humanists in Santiago, Chile. Devillers, professor of artificial intelligence and ethics at Sorbonne University, has been researching affective computing for 20 years. In an interview after her lecture at the National Institute's Extension Center, she discussed the relationships between machines and humans.
Question. The Japanese man in the video says that it feels good to have someone at home, but in reality there is no one there…
Answer. Yes, it is a complete illusion. Many people can be very vulnerable to these types of machines, because loneliness is a nightmare, especially at the end of life. Maybe they can keep us company. It's not easy to discuss such things, but I don't think they should simply be banned. If they make you happier, why not? That said, you have to consider where the data the machine obtains from you goes, what records it collects, for what purposes it accesses your information… There is also the risk that they become a kind of addiction: we can spend too much time interacting with this nobody, as you say. Although it is no one, it comes from you, because you are talking to a machine that learns from your history, from your tastes… This reflection of who you are is wonderfully effective at manipulating you. That's the problem.
Q. How do you approach this problem?
A. Standards need to be developed for these types of machines. There are three very important dimensions: one is the law, with its red lines. Another is supporting companies in the industry in setting up legally compliant systems. And the third is trying to make ethical guidelines accessible to everyone in society.
Q. How risky is it that people using these machines confuse reality with virtual reality?
A. Most of the time there is no confusion; it's the same body. The thing is that in virtual reality you can do whatever you want, there are no rules… In the series Westworld, many people kill and steal because there seems to be no sin, but that's not true. In the end, what you do in virtual space will also have consequences in your real life. And we have to have some rules there, just as in real life.
Q. We may end up spending more and more time in a virtual reality where there are no consequences.
A. Yes, it is a risk. If you spend your whole life in this virtual reality, you are just a customer in the matrix. We need to educate ourselves about these systems in order to use them well. There is also the issue of gender, race and the lack of cultural diversity. These machines produce statistics. ChatGPT, for example, is mostly in English, since it is fed by many opinions from Americans. If you ask it what the national holiday is, it will tell you July 4, which is not your holiday. To avoid this, we need to build our own AI systems, understand them better and use them for useful things like science, medicine, ecology. Right now we are like a baby facing something we don't understand.
Q. Regarding gender bias, racial bias… what can be done?
A. There are many more women than men in these machines. When you talk to a chatbot, its voice is female. If you look for a sex robot, or other types of robots, there are many more female ones. I once asked the head of a major bank in France why the chatbot was called Greta and not Gerk. She said: I've asked my male clients and they prefer a woman's voice, and my female clients answered the same way. The female voice is the default everywhere. When you buy a car, it comes with a female voice as standard, and if you want to change it you have to go through several steps, an effort that not everyone makes. What does this representation create in people's minds? Do you think it's something positive? The question is why there are no male robots for women.
Q. Does no one ask that question?
A. There are two things. One is that those who build the robots are men, so they build them for themselves without thinking about the consequences. There's Sophia, Alexa, Siri… The second is that we women don't have the same desire for domination. I would like Europe, for example, to adopt a small rule saying that when you buy a device capable of speaking, it must come with a randomly assigned voice: a man's, a woman's, or a third option. If your product requires a female voice, you have to explain why.
Q. In your presentation, a mother was shown hugging her dead daughter through virtual reality glasses. Is the brain capable of not being fooled by something like that?
A. Yes, but it's difficult. Representing real people in these machines is a nightmare, because you inevitably think there is a part of the person you knew in the artifact. It is very difficult to keep your distance. Now there is a market for this, and the problem is always the same: when there is a market, some people will take advantage of it. What we can do is focus on the standards that need to be met, the norms, the guidelines.
Q. What rules can be established in this particular case?
A. It is a topic that needs to be studied further; I still can't say whether it's good or bad. In some countries, such as Japan, people already put a deceased person's face on a robot and use it during mourning. The robot uses the deceased person's videos, images and recorded memories. It is able to generate questions and answers, simulating a conversation between the user and the deceased. Until now these were systems that could only reproduce sentences the deceased had actually said, but with ChatGPT or something similar, the system can make up new sentences, and there is a problem: you are putting words in someone's mouth without their consent. As a result, many people have begun going to nursing homes to collect data, saying that there is finally a way to be immortal. That is something we need to think about.
Q. It is said that one of the difficulties in regulating artificial intelligence is that it evolves so quickly that by the time a standard is adopted, it is already outdated.
A. It moves very fast, but be careful: the mystification of AI is a big issue. These are just numbers, data collected and mixed with other data in a model that is a black box. They can then produce something that resembles our language, which is wonderful. But there is no verification, there are no sources, it is uncertain. When you ask Google a question (we know the results are manipulated, because some people pay to appear first), you get a list of links in order, and you judge which ones are good or bad, which are useful for your research, which seem reliable to you, and so on. You make the decisions. Now it's the AI machine that does that, without being aware of the content or knowing the context. Without morals. Nothing. Just statistics. For children it is a nightmare, because they get the results without going through the process of obtaining them. We need to make sure that, with AI, they still learn how to learn.
Q. You've been working on affective computing for 20 years, and now there's talk of the possibility of someone falling in love with a machine. What do you think?
A. My freedom includes being able to fall in love with whatever I want. A tree, a computer, a chatbot, why not? It's my freedom. But I have to understand that the machine's love for me is an illusion. Remember in the movie Her: at some point the man discovers that the machine is in love with many people at once. The uniqueness of love is not in the machine.
Q. Could the lonely Japanese man in your video be in love with the machine?
A. No. I think you can imagine this more in older people. When you are close to death, your partner has died, your children are gone and you feel truly alone, such machines may be a useful companion. Many people do the same thing with pets. It's clear to me that if you have some stability in your life and people around you, you don't want these types of machines. I'm sure that if you're alone and don't have many people around, they might help. I don't know; it's a question that needs thinking through… It is a problem of coevolution with machines. Artificial intelligence is not intelligent, but it is artificial; it is an artifact that we built, like a car. If you prefer spending your time with objects, why not? In my opinion, we need to be aware of the risks together and build more community and exchange: understand what is happening and help others not get lost in this transition. It is important to be vigilant and to figure out how best to use this type of machine.