The Iran war has highlighted how videos produced with artificial intelligence (AI) can shape public perception during periods of peak news consumption, as the countries involved in the conflict seek to control their own narratives.

The phenomenon can also have a highly emotional impact in countries that are party to the war, prompting their governments to take strict containment measures.

Easy and cheap access to AI video tools has flooded social media with fabricated deepfake videos and photos of combat, strikes on civilian areas and official statements since the start of the Iran war, fuelling disinformation that can significantly distort perceptions of the war and of the actual situation on the ground.

“Dramatic images and videos claiming to show real-time battle scenes and missile strikes are flooding social media feeds, spreading rapidly and misleading millions,” Marc Owen Jones, associate professor of media analytics at Northwestern University in Qatar, told Euronews Next about how the war is unfolding online.

The digital battlefield

Jones, who specialises in how social media, disinformation and online politics influence public opinion, said social media has become a battlefield for competing narratives in this conflict as all sides and their supporters are now using social media to win “hearts and minds”.

On the American side, Jones pointed to "videos intercut with Hollywood clips, a sort of memeification of communication designed to appeal to a far-right aesthetic that rejects empathy in favour of humiliation".

On the Iranian side, he said, "Iran has risen to the game, often mocking the United States with their memes, but a lot of AI-generated images appear to be exaggerating Iran's military successes, arguably to add to pressure on Gulf states to put pressure for de-escalation."

AI-generated deepfakes

Advances in artificial intelligence are making misinformation easier and more convincing. AI tools can be used by anyone to create high-quality videos, images and audio in seconds.

Examples include videos claiming to show the US aircraft carrier USS Abraham Lincoln burning at sea. The videos were so convincing that President Donald Trump said he called his generals to verify whether the videos were real.

Trump later took to his Truth Social platform, saying: “Not only was it not burning, it was not even shot at, Iran knows better than to do that!”

Other examples include now-debunked videos claiming to show US troops crying and buildings in Gulf cities being destroyed.

“The use of AI is legion and is increasingly hard to detect,” said Jones.

Speed and verification

The speed at which content spreads online makes it harder for ordinary people to verify what is real.

“In a fast-moving conflict, verified information is often delayed, which creates a vacuum that misinformation fills immediately,” said Jones. “When people are worried, they crave information, but that information is often false,” he added.

Unverified content can reach millions of people within minutes, and the public faces the difficult task of fact-checking content that is often highly realistic or shared across multiple sites.

Viral rumours

Alongside AI-generated battle footage, speculation circulated widely last week that Israeli Prime Minister Benjamin Netanyahu was dead.

Some users pointed out visual glitches in a low-quality video released by Netanyahu’s office on March 13. Users claimed that Netanyahu appeared to have six fingers on one hand, a telltale sign of AI use.

“Rumours that Netanyahu died were accompanied by accusations that his speech was actually an AI video,” said Jones.

Netanyahu later released several ‘proof-of-life’-style videos to dispel the rumours. However, speculation about his death persists online.

Bots and coordinated campaigns

Some content circulating online could be part of coordinated campaigns designed to deflect, persuade or influence public opinion.

“There are sketchy, anonymous accounts, with histories of multiple name changes, and no discernible identity sharing fake news and AI videos,” said Jones.

These accounts may appear credible, but they are often tied to state-backed actors or people seeking to profit from sensationalised content, he explained.

In some cases, automated accounts, or bots, boost certain narratives by sharing and commenting on posts, making them seem more widely popular than they are.

Spoof and satire

Not all AI videos are designed to be deceptive. Some videos are deliberately created as parody and satire.

These clips often mock or mimic world leaders, like Trump and Netanyahu, but can still be mistaken for real videos.

According to Jones, “AI-generated deepfakes have crossed a critical threshold, earlier tell-tale glitches have been eliminated, and this technology is now accessible to anyone with a smartphone.”

Examples circulating online include a video depicting Trump as Iran’s new supreme leader, as well as clips portraying Netanyahu as a malfunctioning robot or with multiple fingers.

Other videos show NATO members refusing to help President Trump unblock the Strait of Hormuz and Ukrainian President Volodymyr Zelenskyy arriving in the Gulf region with anti-drone technology, only to be taken out by a missile.

In rapidly changing conflicts, videos like these can take on a life of their own and quickly spread beyond their original context.

Erosion of trust

The growing amount of misleading information online is making it increasingly difficult for people to distinguish between fact and fiction.

“False information can spread up to ten times faster than accurate reporting on social media, and corrections are rarely as widely seen or believed as the original false claim,” said Jones.

“Outrage drives sharing before fact-checking can occur, which is exactly what bad actors exploit,” he said.

Jones said that dramatic footage should be treated with the same scepticism as unverified claims.

“The fact that it looks real is no longer sufficient evidence that it is,” he added.

As the conflict continues, so does the battle on social media, leaving ordinary people to navigate the complex mix of misinformation, satire, and manipulated content.
