Police across Britain are braced for an outbreak of violence, triggered by the killings of three children at a summer club but fanned by disinformation and an organised campaign by far-right agitators.

Protests have been planned in at least a dozen <a href="https://www.thenationalnews.com/tags/uk" target="_blank">UK </a>towns and cities this weekend, while intelligence sources have told <i>The National</i> they expect a summer of riots to break out, reviving memories of 2011, when London was gripped by looting and arson attacks.

Experts have pointed to “inauthentic behaviour” online – co-ordinated attempts to spread misinformation and disinformation as widely as possible, which in turn can lead to agitation and then violence. Misinformation can be defined as incorrect or misleading information, though not necessarily spread with malicious intent. Disinformation is incorrect information that is deliberately deceptive.

At least 15 “Enough is Enough” rallies have been advertised online over the next few days, calling on “patriots” to gather with England flags and demand an end to asylum seekers and migration. A number of the adverts included the phrases “save our kids” or “stop the boats”.

Hundreds of mosques are strengthening their security and putting protective measures in place, while Home Office minister Lord Hanson told would-be rioters to “be prepared to face the full force of the law on this criminal activity”.

<a href="https://www.thenationalnews.com/tags/keir-starmer" target="_blank">Prime Minister</a> Keir Starmer faces calls to combat the online forces that harness social unrest and act as a catalyst for violence. The protests capitalised on fake news accounts this week that claimed a <a href="https://www.thenationalnews.com/tags/muslims" target="_blank">Muslim </a><a href="https://www.thenationalnews.com/tags/migrants" target="_blank">asylum seeker </a>was responsible for the knife attack on children in the northern town of Southport on Monday.
The claims were spread by Russian state-owned media and some of the UK's influential far-right figures, among others.

As the Southport community gathered for a vigil in memory of the three young girls killed in the attack, a far-right mob travelled to the town on Tuesday and set fire to a local mosque, leaving those trapped inside fearing the building would burn down around them.

Local courts lifted an anonymity order on the suspect the next day. Axel Rudakubana, a 17-year-old British boy of Rwandan heritage, was charged with murdering the three girls and with 10 counts of attempted murder, after eight other children and two adults were seriously injured in Monday's attack.

But the unrest continued. One hundred arrests were made as a mob descended on Westminster, throwing bricks and taunting police; a police car was set on fire in Hartlepool; objects were thrown at police outside a hotel housing asylum seekers in Manchester; and rioting broke out in Aldershot, among far-right gatherings in cities across the UK.

Police in Nottinghamshire, Thames Valley and South Yorkshire have issued statements saying they are aware of potential protests and in many instances have increased their high-visibility presence. In Northern Ireland, the PSNI said it is aware of calls “to block roads using women and children” and for a march to an Islamic centre.

Mr Starmer announced new measures to crack down on any potential violence as fears of a descent into rioting take hold across the country. These include more intelligence sharing between police forces, the use of facial recognition technology, and restricting the movements of known agitators – a tactic currently applied to British football hooligans.

But he also faces calls to put more pressure on technology companies, after it emerged that fake news about the killer was harnessed by foreign states and online influencers to cause the ensuing chaos.
Former prison governor Prof Ian Acheson, now a specialist at the Counter Extremism Project think tank, told <i>The National </i>the events show the need for stricter regulation of social media. “It is time that these catalysts of extremism and violence are held to account,” he said.

It is not known precisely where the fake news about the Southport killer originated. However, a website called Channel3 Now, which purports to be a US-based news outlet but is actually Russian-owned, was among the first to relay it, shortly after a UK-based anti-lockdown activist posted the claim. This was shared by Russian state-backed media and the accounts of influential agitators, including far-right activist Tommy Robinson and Andrew Tate. The leader of the UK's populist Reform Party, Nigel Farage, was also criticised for suggesting the police were covering up the killer’s identity.

This prompted a warning from former MI6 chief Richard Dearlove that online fake news campaigns were a “fundamental tactic” of Russian President Vladimir Putin’s war against the West. “The exploitation of that space is a fundamental tactic in their conflict with the West. So, if these bots have been used to stir up through social media a violent response, I’m not the slightest bit surprised.”

In one instance, a TikTok account that began posting content only after the Southport attack amassed more than 57,000 views for its posts relating to Southport “within hours”, according to Tech Against Terrorism, the UK non-profit that identified the account. This suggested that bot networks were actively promoting the material. Copies and screenshots of the videos were also found circulating on other platforms, such as X and Telegram, further amplifying their reach. Most of the account’s posts called for protests in Southport on July 30.
Tech Against Terrorism said “inauthentic behaviour” online is a growing concern, with extremist and disinformation actors using sophisticated techniques to spread content across multiple platforms while concealing their identities. One such method involves setting up accounts as “beacons” to disseminate information widely.

“This incident highlights the need for a national centre for open-source intelligence to analyse, share, and counter nation-state, terrorist, and extremist disinformation shared across the internet,” Tech Against Terrorism's Adam Hadley told <i>The National.</i> “Given the possible role of foreign interference, we now need a concerted effort for platforms to identify and act against co-ordinated disinformation operations.”

An investigation by Valent Projects, a UK start-up tracking online disinformation, warned that the claim on Channel3 Now may have been a side effect of online revenue generation schemes, which it said are “little understood”. Such websites attempt to generate advertising revenue by using low-cost, automated content to drive as much traffic as possible. “They are designed to maximise profit by operating with as little human intervention as possible, which means they likely don’t have staffers manually scouring the internet for popular content,” the company wrote on X alongside its findings. In the case of Channel3 Now, “it is highly likely an automated process fed off – and then inflamed – real world events”. Channel3 Now has since removed the claim from its website and issued an apology.

The fake news campaign was the “match on the tinderbox” that allowed far-right groups to exploit the nation’s emotional fragility after the Southport stabbings and trigger widespread rioting. “These protests are building on the narrative we have been witnessing for months from the far right,” said retired British Army major Andrew Fox, security analyst at the Henry Jackson Society think tank.
“Southport gave them the opportunity to exploit it and to put a match on the tinderbox. There is only so much the government can do as the police are so short of manpower after the cuts made by the last government,” he said.

The UK passed the Online Safety Act in 2023 to protect adults and children online, but experts say more is needed. Social media companies are given up to 48 hours to take down certain content – but the fake news about Southport had spread within hours.

Platforms could introduce more immediate checks – such as detecting potential fake news and warning users before they share posts, according to Prof Harith Alani, director of the Open University’s Knowledge Media Institute. Better scrutiny of influential users who spread fake news for hateful purposes is also needed, he added.

“We forget the impact of hate speech,” he told <i>The National.</i> “A few incredibly popular accounts” had fuelled the violence, he said. “The question is why these accounts can operate unchallenged on social media, why the platforms are happy [to host] these accounts that can lead to such chaos.”

While the technology to detect such accounts exists, social media platforms are hesitant to police their users in this way, citing free speech. “The business model … is that you get compensated for being popular on social media, not for being accurate,” added Prof Alani.

Social media companies have also become increasingly guarded about sharing their data with external researchers, making it harder to build new tools to combat fake news. “We can produce far better algorithms [to detect and correct misinformation] but the platforms do not collaborate by giving us access to the data. They should be more collaborative,” said Prof Alani. He warned that weaknesses in today’s fight against online disinformation will be amplified as AI-generated content becomes more sophisticated.
“With AI, the next wave of misinformation is going to be far more powerful in terms of convincing populations of false claims. Platforms, governments and society really need to get ready for that,” he said.

A more comprehensive approach is also needed so that communities can step in quickly when a fake news campaign spreads. “It requires a whole-of-society approach,” said Joyce Hakmeh, deputy director of Chatham House’s International Security Programme. “There’s a human side to the solution. Everyone has a role to play, and Big Tech is only a part of the puzzle.”

Prof Matthew Feldman, a leading expert on right-wing extremism, praised the government’s quick policing response to the crisis but told <i>AP</i>: “I would urge them to go further and do something which has equal teeth and speed to be able to counter this tsunami of lies that have all too often spread on social media.”

While far-right groups had “fanned the flames” online, the wider issue was how quickly misinformation spread across multiple platforms, he said. “These people were players in a larger story which is ultimately about disinformation and the way it can motivate people who are angry or hurt to take matters into their own hands,” said Prof Feldman. “Let us not forget within 30 hours of these lies being circulated on social media, 53 police officers were injured in Southport. That’s how quickly online misinformation can turn into offline harm.”

Such street mobilisation has not been seen in the UK for nearly a decade. Unlike earlier protests, the recent unrest appears to have had no central movement co-ordinating events, he said.