
Fact check: AI fakes circulating after quake in Afghanistan

September 2, 2025

Afghanistan's earthquake has left more than 1,400 dead. Amid rescue efforts, fake and AI-generated images have been spreading online. DW fact checkers debunked some of the viral claims.

Destruction in village Mazar Dara after earthquakes
Afghans walk past damaged houses, after earthquakes at Mazar Dara village in Nurgal district, Kunar province, in Eastern Afghanistan, on September 1, 2025. Image: Wakil Kohsar/AFP

A powerful earthquake in Afghanistan has killed at least 1,400 people and destroyed over 5,000 homes. The mountainous terrain is making rescue efforts difficult. As the death toll continues to climb, rescue workers are trying to reach survivors. The midnight earthquake was followed by several aftershocks, and the UN estimates that hundreds of thousands of people could be affected in total.

Meanwhile, online users are facing another challenge: fake and AI-generated images and videos, claiming to show the scale and aftermath of the disaster, have been circulating on various social media platforms, adding another tragic dimension to Sunday's natural disaster. DW fact checkers have debunked some of the most popular claims.

An AI-generated video posted on X — when you see a large group in a video, watch for individuals who appear to be merging or dissolving. Image: X

Fake and AI-generated videos

Claim: A video posted on X claims to show a mass funeral in which hundreds of bodies shrouded in white cloth are being carried to their final resting place. "This is not Gaza — this is Afghanistan's Kunar province, where more than 800 people have lost their lives and thousands have been injured in just one night due to a devastating earthquake," the caption reads.

It further appeals for donations and provides several telephone numbers: "Anyone who wishes to help or donate can reach out to the following Afghan numbers…"

DW Fact check: Fake

This video is fake. To be more precise, it was generated with the help of artificial intelligence (AI). There are several obvious clues in the footage, such as the fact that there are no identifiable faces in the video. A few figures appear to be moving in the eight-second-long clip, but their movements look unnatural, almost robotic.

It also appears as if each body were being carried by just two people, without the help of a stretcher or coffin. Several individuals appear to be walking away from the procession, or against the direction of the crowd. Both details seem nonsensical in this context. For the most part, the bodies being carried are rigidly horizontal, which looks strange.

Media reports from the earthquake-affected Kunar province show large-scale destruction, with many collapsed buildings. However, in the video, there are no signs of recent damage — the buildings in the background appear intact. Unsurprisingly, many users have already pointed out in the comments that they think the image is AI-generated.

A reverse image search led us to several earlier posts in which the same video was linked to Pakistan's floods on August 20. As Afghanistan deals with the aftermath of Sunday's massive earthquake, its neighbor Pakistan has been experiencing exceptionally severe flooding since late June.

Different versions circulating online

The video also appears on TikTok in a similar version. In this instance, the footage has a slightly higher resolution, and the glitches typical for AI are even more pronounced. After just three seconds, viewers can see individuals merging into one another or disappearing completely. This video was posted on August 19 — well before Sunday's earthquake — without any caption. In the background, there is a song playing with Urdu lyrics. 

Fake and AI-generated images have been circulating online following the earthquake in Afghanistan. Image: X

In a similar video, another user wrote: "Behind every dead body, there's a shattered family." This is another eight-second-long clip, apparently showing the funeral of hundreds of people. Hashtags associate this video with the earthquake in Afghanistan. The video contains the same clues: immovable and unrecognizable faces, people walking in different directions, or merging with one another, and stiff white bundles that look identical. 

We took a screenshot of one of the frames and found that this video had also been shared earlier, in association with Pakistan's floods. One user posted the video with a visible link to a UK-based charity organization. When contacted, the organization told us they were unaware of any such video being posted online.

Why short clips?

Both videos are only eight seconds long, and we were unable to find longer versions of either online. Many software tools now allow users to create short clips using artificial intelligence, but typically only up to a maximum length of around eight seconds. Hany Farid, a computer scientist at the University of California, Berkeley, who specializes in digital forensics, pointed this out in a LinkedIn post in June this year.

"One simple tip-off (for now at least) is that all of these videos are either exactly eight seconds in length or composed of short (eight seconds or less) clips composited together. Why eight seconds? This is the current maximum length that [the text-to-video generation tool] Veo 3 can generate a continuous shot. Other models have slightly longer limits, but 8-10 seconds is typical," Farid wrote.

Both videos assume a bird's-eye view, as if the footage had been filmed by a drone. While this is possible, DW Fact check has often observed this pattern in AI-generated videos allegedly showing destroyed infrastructure in Syria, Gaza or Iran.
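The eight-second tell described above can be turned into a simple screening heuristic. The sketch below is illustrative only: the file names and durations are invented, and in practice the durations would come from a tool such as ffprobe. It flags clips whose length sits at or just under the typical single-shot limit of current text-to-video models.

```python
# Minimal sketch: flag clips whose duration matches the ~8-second
# single-shot limit typical of current text-to-video generators.
# Assumption: durations (in seconds) were measured elsewhere, e.g. with ffprobe.

GENERATION_LIMIT_S = 8.0   # typical maximum continuous shot length
TOLERANCE_S = 0.5          # allow for container/rounding differences

def is_suspicious_length(duration_s: float) -> bool:
    """True if a clip's duration sits at the typical AI generation limit."""
    return abs(duration_s - GENERATION_LIMIT_S) <= TOLERANCE_S

# Hypothetical clips for illustration (names and durations are invented)
clips = {"funeral_1.mp4": 8.0, "funeral_2.mp4": 7.96, "drone_raw.mp4": 94.2}
flagged = [name for name, dur in clips.items() if is_suspicious_length(dur)]
```

A matching duration is of course only a weak signal on its own — genuine clips can also be eight seconds long — so it should be combined with the visual checks described above.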

In addition to these videos, AI-generated images are also circulating online, falsely claiming to show the destruction caused by the earthquake in Afghanistan. 

This AI-generated image falsely claims to show the destruction caused by the earthquake in Afghanistan. Image: X

Claim: In a viral post on X (archived here), a user wrote that "Kunar province in Afghanistan has been severely impacted by a powerful earthquake, leaving residents injured and devastated." The user posted a picture in support of an appeal: "The community is in urgent need of immediate crisis response and assistance." 

DW Fact check: Fake

The scene in the image is cluttered with visual cues meant to evoke disaster: demolished houses, people standing outside, a fire burning in the rubble, and the moon partly hidden by clouds over the mountains. This alone is enough to set off alarm bells that the image could have been generated with the help of AI. It looks as if the picture had been crafted with great care and attention to detail.

How can you check whether an image is authentic?

A simple search of the image revealed its origin: "Made with Google AI."

There are also visible flaws in the image itself: shadows fall on the same side as the light source, when they should fall in the opposite direction. The people in the image appear to be wearing winter clothes and look strikingly calm. Genuine photos from the area show that locals are not dressed in winter clothing.
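Beyond reverse image search, a complementary check is to look at an image's embedded metadata. Some AI generators, Google's among them, are reported to label their output with the IPTC digital-source-type value "trainedAlgorithmicMedia". The sketch below simply scans a file's raw bytes for that marker; this is an assumption-laden heuristic, since social media platforms routinely strip metadata on upload, so its absence proves nothing.

```python
# Sketch: scan an image file's raw bytes for the IPTC "trainedAlgorithmicMedia"
# digital-source-type marker that some AI generators embed in XMP metadata.
# Caveat: many platforms strip metadata, so a missing marker is not proof
# of authenticity.

AI_MARKER = b"trainedAlgorithmicMedia"  # IPTC NewsCodes digital source type

def has_ai_provenance_marker(data: bytes) -> bool:
    """Return True if the raw image bytes contain the IPTC AI marker."""
    return AI_MARKER in data

def check_file(path: str) -> bool:
    """Convenience wrapper: read a file and scan it for the marker."""
    with open(path, "rb") as f:
        return has_ai_provenance_marker(f.read())
```

A positive hit is strong evidence of AI generation; a negative result should send you back to reverse image search and visual inspection.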

A reverse image search with Google Lens reveals that this image was created with Google's own AI generator. Image: google.com search

Other online users have shared screenshots of AI detection tools, pointing out that this image was fake.  

As is so often the case in breaking news situations, several mainstream media outlets also fell for this picture and used it in their reporting on Afghanistan's quake, including the Indian newspaper The Indian Express and the Turkish state broadcaster TRT.

Edited by: Rachel Baig
