
Deepfakes & fake news: how do they work, and how can you spot them?

Don't trust your eyes

Instinctively, we tend to trust and give credit to image and video content. Unlike written text, an image gives the illusion that it cannot lie. And because the democratization of editing software is a recent phenomenon, we are far less wary of it. Yet deepfakes and other face swaps have enormous power to amplify the spread of fake news.

Also read: Fake news and real drugs: “Hey Twitter! What I concern myself?”

Deepfake is a technique based on artificial intelligence. It makes it possible to synthesize someone’s voice, offering the possibility of making them say anything, and also to replace a person’s face in a video. Respectively called voice morphing and face morphing, these special effects offer exceptional freedom in content creation. The freedom is so great that it is difficult to see its limits.
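To give a rough sense of the face-replacement idea, here is a minimal Python sketch using OpenCV’s bundled face detector. To be clear, this is not a deepfake: real deepfakes rely on trained neural networks, whereas this crude copy-and-paste swap only illustrates the basic mechanics of substituting one detected face for another. The `crude_face_swap` helper and the file paths are hypothetical.

```python
import cv2

# Load OpenCV's bundled Haar cascade for frontal face detection.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def crude_face_swap(src_path, dst_path, out_path):
    """Paste the first face found in src over the first face found in dst."""
    src = cv2.imread(src_path)
    dst = cv2.imread(dst_path)
    src_faces = detector.detectMultiScale(cv2.cvtColor(src, cv2.COLOR_BGR2GRAY), 1.3, 5)
    dst_faces = detector.detectMultiScale(cv2.cvtColor(dst, cv2.COLOR_BGR2GRAY), 1.3, 5)
    if len(src_faces) == 0 or len(dst_faces) == 0:
        raise ValueError("No face detected in one of the images")
    sx, sy, sw, sh = src_faces[0]
    dx, dy, dw, dh = dst_faces[0]
    # Resize the source face to the destination face's box and paste it in.
    face = cv2.resize(src[sy:sy + sh, sx:sx + sw], (dw, dh))
    dst[dy:dy + dh, dx:dx + dw] = face
    cv2.imwrite(out_path, dst)

# Hypothetical usage:
# crude_face_swap("person_a.jpg", "person_b.jpg", "swapped.jpg")
```

The result of such a naive paste is obviously fake; what makes genuine deepfakes dangerous is that neural networks learn to blend lighting, skin texture and expressions so that the seam disappears.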

According to AFP Fact Check, a service that fights the massive dissemination of fake news, more than 50% of identified fake news items are images. A not-so-surprising figure since, as we said, we are less wary of what we see. Worse, even when we are careful, detecting a faked video or picture is an extremely difficult exercise. The biggest flaw in deepfakes is usually around the eyes. And even then, you have to look closely; it is far from obvious.

Studies have shown that even photography and publishing professionals rarely manage to detect a fake, so imagine how average amateurs fare. Faced with the quality of this content, only one ally still seems able to fight back: artificial intelligence itself. Among the emerging tools, we can notably mention the blockchain. This archiving system, used for Bitcoin, makes it possible to verify whether a video is authentic, provided it has been recorded in the blockchain.
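To make the idea concrete, here is a minimal sketch in Python of how such hash-based authentication could work in principle: the original video’s fingerprint is recorded at publication time, and any later copy is checked against it. The `record_fingerprint` and `verify_video` helpers and the in-memory `ledger` dictionary are hypothetical stand-ins for a real blockchain; only the hashing uses the standard library.

```python
import hashlib

# Hypothetical stand-in for a blockchain ledger: in reality this would be
# an append-only, distributed record rather than a Python dictionary.
ledger = {}

def sha256_of_file(path, chunk_size=1 << 20):
    """Compute the SHA-256 fingerprint of a (possibly large) video file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def record_fingerprint(video_id, path):
    """At publication time, store the original video's fingerprint."""
    ledger[video_id] = sha256_of_file(path)

def verify_video(video_id, path):
    """Later, check whether a copy matches the recorded original bit for bit."""
    return ledger.get(video_id) == sha256_of_file(path)
```

Note that any re-encoding or single-pixel edit changes the fingerprint, so this approach can only prove that a file is identical to the registered original; it cannot, on its own, distinguish a harmless recompression from a malicious montage.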

If it is still possible today to tell a montage from a real video, time is running out. Video and audio fakes are gradually approaching a level of realism that is undetectable, even for artificial intelligence. In 2019, a Mark Zuckerberg video sparked waves of indignation on social networks. In it, the CEO of Facebook implied that whoever controls the data of millions of people controls the future. While the authors were not malicious and quickly confirmed that it was a hoax, their video only increased concern about the use of such footage. Today, 96% of deepfakes are pornographic, but there is no doubt that in the years to come, politics will be affected by this phenomenon.

Uneven awareness across the world

In France, as in all developed countries, deepfakes play a major role in the dissemination of fake news. However, we are fortunate to have traditional media that do an enormous amount of work to limit their spread. Even if more than half of the population now gets its news from social networks, fake news still faces obstacles. Conversely, in poorer or developing countries, social networks are often the sole source of information, and no outlet has the aura or the means to debunk possible deepfakes.

The consequence: the population is poorly informed and falls more easily for conspiracy theories. This can go as far as riots and completely unjustified scenes of violence. In Gabon, in early 2019, the government was accused of covering up the death of the president, who had suffered a stroke, by broadcasting a deepfake on television. The controversy resulted in an attempted military coup.

Deepfakes and other montages also have considerable humorous potential. Seeing Marine Le Pen speak Arabic or Obama insult Trump will inevitably make you smile. Some have even used them for advertising, such as Solidarité Sida, which broadcast a deepfake of Trump announcing that AIDS had been eradicated. But behind the laughter, the jokes and the advertising ideas lies an increasingly serious danger. A broad effort of education and awareness needs to be put in place. In the meantime, keep your eyes and ears open: your critical mind needs you.

Also read: Fake News: false information explodes in the USA
