Brussels urges US social media sites to act quickly on hate posts

European Commission threatens tougher law to curb online hate speech and racism

Brussels is pressing US social media giants such as Twitter, Facebook, YouTube and Microsoft to escalate a voluntary clampdown on illegal hate speech and incitement to terrorism on their sites or face the prospect of new laws to curb online racist abuse.

The call from the European Commission for swifter action to delete racist posts and terrorist content promoting online radicalisation follows a backlash against the proliferation of "fake" US election news on sites run by Facebook, Twitter and Google, which owns YouTube.

It comes amid alarm in countries such as Germany at the spread of online hate speech against refugees, which often echoes Nazi-era rhetoric against Jews. This has already led Heiko Maas, Germany’s justice minister, to threaten criminal liability for failing to delete racist posts.

US social media groups signed a “code of conduct” with Brussels in May that required them to “review the majority” of flagged hate speech within 24 hours, remove it if necessary and even develop “counter narratives” to confront the problem.

24 hours

But after 600 notifications to the companies of suspected hate speech in six months, the commission is not happy with progress. European justice ministers will discuss a report this week that finds companies are not removing “the majority” of notified illegal hate speech within 24 hours.

The report, prepared for Vera Jourová, EU justice commissioner, found that 40 per cent of recorded cases were reviewed within 24 hours, but the figure rose above 80 per cent after 48 hours. Twitter was slowest to respond, while YouTube was fastest.

“If Facebook, YouTube, Twitter and Microsoft want to convince me and the ministers that the non-legislative approach can work, they will have to act quickly and make a strong effort in the coming months,” Ms Jourová said.

“The last weeks and months have shown that social media companies need to live up to their important role and take up their share of responsibility when it comes to phenomena like online radicalisation, illegal hate speech or fake news.”

The code of conduct has been criticised by campaigners for online human rights. Joe McNamee, executive director at European Digital Rights, said EU legislation on hate speech fell below basic human rights standards in terms of clarity.

Feeble ‘code’

“Some member states don’t implement it properly, and those who do implement it in very diverse ways. Rather than legislating to solve the underlying problems with European law, the ‘code’ is a feeble effort to draw attention away from this unacceptable situation,” he said.

There was no response from any of the four companies to emailed queries from the FT.

Of the 600 cases notified, 316 were deemed to require a response from companies. Ms Jourová’s report said 163 items were deleted and 153 were not removed because the companies concluded there was no breach of legislation or community rules.

Ms Jourová believes the targets can realistically be achieved within the code of conduct if the companies step up their efforts.

At the same time, the report found big differences in the rate at which racist posts were removed across member states. Removal rates were in excess of 50 per cent in Germany and France but as low as 4 per cent in Italy and 11 per cent in Austria. – (Copyright The Financial Times Limited 2016)