To combat misinformation, researchers have developed AntiFake, a tool that prevents your voice from being used for deepfakes. By adding noise to audio recordings before they are published online, the tool keeps them from being exploited by artificial intelligence.
Just as ugly sweaters can neutralize facial recognition, researchers at Washington University in St. Louis have developed a system to protect voices against speech-synthesis deepfakes. Beyond making it seem that someone said something they never did, audio deepfakes can also be used for phone scams. The system, called AntiFake, is inspired by the attacks cybercriminals use against artificial intelligence.
The tool is a filter that adds noise to an audio clip after recording but before it is published online. The approach is reminiscent of the one developed by MIT to protect photos. Even though the voice remains perfectly intelligible to a human listener, any deepfake created from an AntiFake-protected recording is easy to identify.
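To give a rough idea of the principle, here is a minimal sketch in Python that adds a small, barely audible noise to a recording before it is published. This is not AntiFake's actual algorithm, whose perturbation is specifically optimized to confuse speech synthesizers; the file names and noise level below are made up for the example.

```python
import numpy as np
import soundfile as sf

# Load the recording to protect (hypothetical file name).
audio, sample_rate = sf.read("my_voice.wav")

# Add a small perturbation. AntiFake computes a perturbation tuned against
# speech-synthesis models; here we simply use random noise scaled to a
# fraction of the signal's amplitude, as a stand-in illustration.
noise_level = 0.02 * np.max(np.abs(audio))
perturbation = np.random.uniform(-noise_level, noise_level, size=audio.shape)
protected = np.clip(audio + perturbation, -1.0, 1.0)

# Save the "protected" clip that would be published instead of the original.
sf.write("my_voice_protected.wav", protected, sample_rate)
```

In the real tool, the added signal is not random: it is chosen so that the clip still sounds natural to humans while steering the AI's voice model away from the speaker's actual voice.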
Effective protection against most speech synthesizers
“We slightly alter the recorded audio signal, distorting or perturbing it just enough so that it still sounds authentic to human listeners, but for AI it’s completely different,” explains Ning Zhang, one of the project’s creators. The authors provide sample audio clips before and after applying the AntiFake filter on the project page. Although this first version is promising, it seems to have some limitations: the protected clips sound as if they were recorded with a low-end microphone in a bathroom next to a running faucet…
The researchers successfully tested their system against five of the most advanced speech synthesizers. For now, AntiFake protects short clips, but the researchers believe the tool could also be extended to longer recordings or even music. However, it may only be a matter of time before AI learns to bypass this type of protection. The source code is available on the project’s GitHub page.