UK regulator Ofcom found a clear connection between online social media posts and increased violence on the streets during riots over the summer.

A review from the UK’s communications regulator has concluded there was a “clear connection” between online social media posts and violence on the streets during the Southport riots this summer. 

Melanie Dawes, chief executive of UK regulator Ofcom, said in an open letter that misinformation spread “almost immediately” after three children were killed in a stabbing at a dance studio in Southport, UK, on July 29. 

The content proliferated amid an “uneven response” from some companies in limiting the spread of hateful content on their platforms, Dawes continued.

“Posts about the Southport incident and subsequent events from high-profile accounts reached millions of users, demonstrating the role that virality and algorithmic recommendations can play in driving divisive narratives in a crisis period,” the letter said. 

Accounts, groups disseminate information to provoke violence

Some accounts posted and spread claims that the attacker was a Muslim asylum seeker, along with claims about his political views. 

Axel Rudakubana, who was born in the UK to Rwandan parents, was charged in connection with the stabbing.

False information about Rudakubana’s identity continued to spread, even as there was evidence that the posts were being made to “stir up racial and religious hatred”. 

Others used online groups to spread plans for violence against a local mosque within hours of vigils for the victims.

“Some of these groups disseminated material encouraging racial and religious hatred, and provoking violence and damage to people and property, including by identifying potential targets for damage or arson,” the letter continued. 

During the riots, platforms dealt with “high volumes” of misinformation posts, “reaching the tens of thousands in some cases,” the Ofcom letter says. 

New powers for UK regulator coming

A week after the stabbing, Ofcom reminded tech companies in an open letter that their platforms can be used to stir up hatred and provoke violence. 

The regulator’s statement also pointed out that social media companies would face new safety requirements under the UK’s Online Safety Act – but that companies did not have to wait until the act’s final codes of practice were published to make their platforms safe.

The act requires tech companies to explicitly disclose how they protect users from illegal, harmful content and to have robust processes in place to take it down quickly. 

“I am confident that, had the draft codes been in force at the time, they would have provided a firm basis for urgent engagement with services on the steps they were taking to protect UK users from harm,” the letter reads. 

The findings from the Southport riots will be used to identify gaps in current regulation, the letter continued, such as the need to request more information about each platform’s crisis-response protocols. 
