Fact check: Is audio of JD Vance criticizing Elon Musk real?
March 26, 2025
An audio clip purporting to be of Vice President JD Vance criticizing Elon Musk has gone viral on social media, prompting Vance's communications director to state that the recording is not real. The clip does not mention Musk by name, but it includes phrases such as "he is from South Africa" and "he wants to tank down the economy and his cars," which suggest it refers to Musk, who was born in South Africa and owns Tesla.
Vice President Vance immediately distanced himself from the audio on X, saying the voice in the clip was not his but AI-generated. He wrote, "I'm not surprised this guy doesn't have the intelligence to recognize this fact, but I wonder if he has the integrity to delete it now that he knows it's false." He warned that he would take legal action against the user who posted the clip if it was not removed, adding, "otherwise, it will be a case of defamation." The clip was still online when this article was written on March 26.
DW Fact check and ARD-Faktenfinder investigated the clip's authenticity. Here's what we found.
Claim: In the viral audio, purportedly of Vance criticizing Musk, a voice can be heard saying, "Everything that he's doing is getting criticized in the media, and he says that he's helping, but he's not. He's making us look bad. He's making me look bad (…) But he's not even an American. He is from South Africa."
Verdict: Likely fake.
Most analyses indicate the audio is AI-generated
The audio recording is of poor quality, with significant noise and noticeable pauses, which are both potential signs of manipulation. "The unusually low audio quality, a common trick to conceal evidence of manipulation or synthesis, is highly suspicious," said Hany Farid, a digital forensics expert from the UC Berkeley School of Information.
Elon Musk, the owner of X (formerly Twitter), does not hold an official position in the Trump administration. However, he heads the newly formed Department of Government Efficiency (DOGE), named after his favorite cryptocurrency meme coin, which oversees potential cuts to the US government budget.
Musk has faced media criticism and protests, particularly over his political ties. The "Tesla Takedown" movement has led to demonstrations at Tesla dealerships as well as several acts of vandalism. Tesla's stock has dropped significantly in recent months.
The audio is unclear at many points, and the term "South Africa" is not clearly audible. DW's fact-checking team has listened to it multiple times and believes it sounds somewhere between "South Africa" and "South America." However, the subtitles read "South Africa," which may influence viewers' perceptions.
Experts emphasize that verifying an audio recording requires more than just technical analysis. It also involves linguistic and contextual evaluation, especially when high-profile figures are implicated. This includes analyzing speech patterns and pauses.
"The cadence and intonation are not consistent with Vice President Vance's typical speech patterns," says Farid. He conducted a detailed forensic analysis at DW's request, tracked down the highest-quality recording available, checked it with GetReal Labs, and concluded that, "based on our review, we believe the audio is likely inauthentic."
DW Fact check and ARD-Faktenfinder found that the earliest version of the clip was posted on TikTok on March 23. In one of the initial clips, Vance's name was misspelled as "Vence." Such mistakes are common in mis- and disinformation posts and could indicate that the audio was deliberately fabricated.
Inconsistent AI detection results
Our investigation tested various open-source audio detection tools, yielding mixed results. While most tools flagged the clip as likely inauthentic, some were inconclusive.
A platform developed by the University at Buffalo analyzed the clip using multiple detection tools, with results indicating a 90-100% likelihood that it is fake. Another tool, Hiya, whose browser extension DW Fact check used, was inconclusive, estimating only a 47% chance that the clip was AI-generated.
Germany's Fraunhofer Institute for Applied and Integrated Security (AISEC) analyzed the audio at our request and, based on its deepfake detection tool, deepfake-total.com, concluded that there is an 81% probability that it is fake.
"It is not unusual for different tools to produce varying results. New or previously unknown deepfakes are often difficult to reliably identify, especially if detection systems were not specifically designed for them," said Nicolas Müller from AISEC in an interview with ARD-Faktenfinder.
These inconsistencies highlight the challenge of detecting sophisticated AI-generated content with absolute certainty, a challenge DW's fact-checking team also faces.
Deepfake technology is evolving
New techniques are making audio deepfakes increasingly sophisticated and harder to detect. A newer method, speech-to-speech synthesis, mimics a person's real voice instead of using traditional text-to-speech technology. According to experts, this technique could already be in use.
Initial analyses suggest the examined audio clip of Vance is artificial. "Although we have not performed a full biometric analysis, the voice in the recording does not match Vance," said Farid. He also warned of a growing trend of fabricated leaks targeting journalists and politicians.
If you are interested in learning more about how to spot audio deepfakes, have a look at our research here.
Carla Reveland and Pascal Siggelkow from ARD-Faktenfinder contributed to this report. This article is part of a collaboration between Germany's public broadcasting fact-checking teams ARD-Faktenfinder, BR24 #Faktenfuchs and DW Fact check.
Edited by: Rachel Baig