Social media companies will be fined if they fail to remove extremist content from their sites within an hour of being alerted by police, according to proposals under discussion in Brussels.
The European Union is planning to act after increasing irritation at the failure of companies such as Facebook and YouTube to remove extremist content.
Officials are working on legislation, to be unveiled next month, following criticism from lawmakers across Europe that tech companies have been too slow to remove terrorist and extremist content from their sites.
The issue is felt particularly keenly in Europe after a string of terrorist attacks over the past year, every one of which was linked to “online terrorist content”, according to the senior EU official responsible for security, Julian King.
The pan-EU laws would apply to all websites regardless of size and would cover videos, audio, text and pictures, the commissioner for security told the Financial Times.
He said that the EU had not seen “enough progress” from the big tech firms and would “take stronger action in order to better protect our citizens” from terrorist material.
The new legislation is likely to mirror the EU’s current voluntary code, updated in March, which encourages platforms to remove offending material within an hour of it being flagged by law enforcement.
Law enforcement officials say swift action is essential to prevent extremist content being disseminated widely. Home Office analysis showed that ISIS supporters used more than 400 different platforms in 2017, with the majority of shares occurring within the first two hours of release.
The commissioner’s comments come a month after an undercover documentary revealed that Facebook failed to remove an anti-Muslim video from the network after a company moderator said that immigrants were “less protected” than other people.
The documentary was screened after the company had promised, under intense pressure from European leaders including France’s Emmanuel Macron and the UK’s Theresa May, to crack down on extremist content.
Germany already requires companies to remove content deemed illegal under its hate speech law within 24 hours, with fines of up to €50 million for those flouting the rules.
Facebook chief operating officer Sheryl Sandberg acknowledged in January that the company had to do better to stem the spread of hate speech. It said it was investing in artificial intelligence and hiring up to 20,000 people by the end of 2018 to identify and remove harmful content.
Improvements in technology had allowed Facebook to remove 1.9 million pieces of ISIS and Al Qaeda material in the first quarter of 2018, according to the FT.
“Big social media companies have been warned for years about terrorist content on their sites. It has taken them a long time to make any meaningful advances,” said Robert Postings, who examined the role of social media sites in the plotting of terrorist attacks.
His report, Spiders of the Caliphate, published in May by the Counter Extremism Project, found that ISIS networks were growing on Facebook and had the potential to plan and direct terror attacks as well as recruit new members.
"Regulation from the EU, and governments in general, is a good way to encourage them to counter this terrorist content more efficiently," he told The National.
The Counter Extremism Project found that over a three-month period this year, YouTube hosted some 1,348 ISIS videos, which were viewed more than 160,000 times, even though three-quarters were removed within two hours.