Claim: A viral audio clip alleges that John Mahama advocated lying to the electorate to win votes ahead of the election
Verdict: False. Analysis shows the viral audio was cloned using AI.
Full Text
In recent times, audio fakes have become a part of elections in Africa and globally. The trend became pronounced in the 2023 Nigerian election and the United States primaries.
Ahead of the Saturday, Dec. 7 election, the information space was stirred by a 27-second audio clip attributed to John Mahama, the presidential candidate of the opposition National Democratic Congress (NDC). The audio, which went viral on WhatsApp on Dec. 3, 2024, accompanied by an image of Mr Mahama, suggested he encouraged his supporters to keep lying to the Ghanaian electorate to gain their support.
“You know like I always say Ghanaians have short memory. You can lie to them today, and by tomorrow, they will forget. Let us finish hard, keep lying to them, and get all the votes. The Ashantis believe the lie and are coming out to support it. So let’s finish hard. For the security forces, let us manage them till we finish the election, and then we can flush them out,” the transcription of the audio reads.
We also found that this audio has appeared on YouTube. Given the trend of audio fakes in past elections and their influence on the electorate, DUBAWA decided to verify the clip.
Verification
We first listened to the audio and compared it with previous recordings of Mr Mahama speaking here and here. We observed that Mr Mahama’s voice sounded hoarser and deeper in both interviews than in the viral clip, which was smooth. However, this alone was not enough to make a judgment.
We then conducted a keyword search using sentences from the viral audio transcription and discovered the claim had appeared on Facebook here and here, on Instagram here and here, and on websites like thecustodianghonline.
Next, we subjected the audio to two AI detection tools: Deepware and the Hiya Deepfake Voice Detector. Deepware flagged the audio as suspicious at 58% under its Avarify model result.
Screenshot of Deepware result.
Similarly, the Hiya Deepfake Voice Detector rated the audio 62% inauthentic, noting that its models were uncertain about the voice.
When we used the same Hiya tool on the other interviews of Mr Mahama found online, it rated one interview as 99% authentic and the other as 77% authentic.
Screenshot of Hiya Deepfake Voice Detector.
Conclusion
Manual analysis of the audio and analysis with AI detection tools, compared against other public interviews of Mr Mahama, show that the viral audio was cloned using AI.