More than 100 pieces of content from Russian state media and disinformation actors that violate European rules still appear on X despite being reported to the platform, according to a new report.
The report commissioned by German non-profit group WeMove Europe, which was shared exclusively with Euronews, found “125 clear sanction-violating posts” on the Elon Musk-owned platform.
Some of the posts included programmes from the Russian state broadcaster Russia Today (RT), which has been banned by the European Union since the 2022 invasion of Ukraine.
In one instance, the Russian Ministry of Foreign Affairs account on X shared an excerpt from an RT documentary that pushed a false narrative about an Adolf Hitler collaborator who was “elevated to the rank of national hero by the Kyiv regime”. The post also provided a link to bypass sanctions and watch the full film on Telegram.
The researchers behind the paper had reported to X all the content deemed illegal under Europe’s Digital Services Act (DSA), the bloc’s digital transparency rules. The report found that only 57 per cent of the reports of illegal content received acknowledgement receipts, which breaches the DSA.
Of the 125 posts reported, X removed only one; in the other cases, the company said there was no violation of EU law.
In some cases, X responded to the researchers’ complaints within two minutes, the report said, suggesting that automation is playing a big role in X’s content moderation.
The European Commission, the EU’s executive body, launched a formal investigation into X this year for suspected breaches of the DSA and said it would finalise the investigation before the summer recess, which begins on July 25.
However, the Financial Times reported last week that the Commission will miss this deadline as it aims to conclude trade talks with the United States.
Euronews Next has contacted the Commission for comment about the latest report and the X investigation but did not receive a reply at the time of publication.
In January, French prosecutors also launched an investigation following allegations that X’s algorithm was being used for the purposes of foreign interference.
The researchers filed the reports to X on July 8 and 9, 2025. In most cases they received automatic replies saying X would look into their complaints, but in the majority of cases they never heard back.
Euronews Next has also contacted X for comment about the report but did not receive a reply at the time of publication.
Russia’s online war
Under the EU sanctions regime, it is prohibited to offer content hosting services to sanctioned entities, such as broadcasters, or to sanctioned individuals.
Russia has intensified its disinformation campaign in Europe since it invaded Ukraine in 2022, the researchers said.
Along with official Russian government accounts spreading fake news, the researchers found accounts likely operated by the “Social Design Agency,” a Russian company known for running Russia’s “Operation Doppelgänger” influence campaign, as well as anonymous users repeatedly posting such material.
“Overall, the volume indeed exploded [since the war]. It’s much more significant,” said Charles Terroille, a project and investigative research officer at the fact-checking group Science Feedback who worked on the paper.
“A lot of the posts that we flagged to X are for instance, documentaries, if you can call them that, so 40-minute videos hosted on X that are Russia Today showing, for example, how Ukraine deserved it all or how [President] Zelenskyy and all the government people and officials in Ukraine are just fans of Nazi figures and all these widely false and reported stories that Russia is propping,” he told Euronews Next.
Terroille said another Russian method is to fabricate pages that look like well-known Western media outlets and spread them on X.
He added that fake news about public health and misinformation about COVID-19 vaccines are still circulating, along with other topics such as misinformation about the environment, and are “absolutely weaponised” by Russia.
For Taïme Smit Pellure, a digital campaigner at WeMove Europe who led the report, the most shocking finding was that the content is also translated into people’s native languages, such as French, and is “everywhere” on X.
She told Euronews Next that both the Commission and X should be acting faster and that her organisation has reached out to the Commission but it has not had a “positive response yet”.
“We know they are working on this, we know they are, it’s not like they’re looking away completely, just taking their time because they want to do it right and we completely understand that,” she said.
Another recommendation is for the Commission and European governments to work together in a more coordinated manner to address the issue, said Saman Nazari, lead researcher at civic campaign group Alliance4Europe, who also worked on the paper.
“As long as we stay only working in our own little bubbles, we do not stand a chance against a multi-billion euro influence apparatus,” Nazari said.
As for X, he said that “there is not that much nuance. It’s straight-up illegal content,” and that “it doesn’t take much time” to find and address it.
“This is incredibly low-hanging fruit,” he added.