Fake videos using robotic voices and deepfakes circulate in Mali

These videos are examples of misinformation with automated voices circulating massively on social networks in Mali. © Observers

Over the last few weeks, social media in Mali has been chock-full of videos that all sound very similar: a computer-generated voiceover talks about domestic politics or France’s presence in the country. Some of them are badly edited and quite basic, while others are more technologically advanced. Both are examples of a new type of disinformation circulating in the country.


The latest episode of Truth or Fake, which you can watch in the player above, takes a close look at a few examples of these videos found on the Facebook page Nabi Malien Den Halala, which has 82,000 followers and describes itself as a page sharing music and news about the Malian rapper TAL-B Halala.

© Facebook

But the page does not just share the latest rap news. It also specialises in sharing a particular style of disinformation: videos with a computer-generated voiceover designed to look like official television news stories.

One video on the page was viewed over 900,000 times on Facebook and was widely shared on WhatsApp at the end of 2021. The video claims that the French intellectual Bernard-Henri Lévy is a spy selling information about the Malian army’s positions and forces to jihadists in Mali. The claim rests on an out-of-context photo of him taken in Sudan in 2007.

These types of fake videos are becoming increasingly popular on social media in Mali. They are often published on the YouTube account of Africa24.info, a platform presenting itself as an official media organisation but which regularly publishes pro-Russian disinformation.

Deepfake videos designed to mislead

On December 27, the Nabi Malien Den Halala page shared a video showing a television news presenter speaking to the camera in front of a TV studio background. The presenter says that France gave money to various Malian political parties to persuade them not to participate in the national consultation set up by the military junta, which has been in power since the May 24, 2021, coup d’état.

The claim stems from a poorly photoshopped letter that circulated online in Mali last September. Alleged to have been written, though not signed, by French President Emmanuel Macron, and typed in a patchwork of different fonts, font sizes and line spacings, the letter says France will pay €23 million to Malian political parties that agree to support France’s presence in the country.

© Facebook

But the video with the presenter is not real: It was made using the free website Synthesia, which allows users to create avatars that can be made to say anything. Synthesia has helped generate a number of deepfakes – automated videos that seem authentic – like the one below of David Beckham speaking nine different languages as part of a campaign to fight against malaria.

We got in touch with Synthesia’s CEO, Victor Riparbelli, who told us that the person who created the fake news video had been banned from the site after Synthesia’s content moderators flagged the video’s political nature, which violates the platform’s terms of service.

Deepfakes are still used relatively rarely for disinformation, and are often of poor quality. But the technology that generates them is becoming increasingly accessible.