Hate speech: social-media firms only a few breaches from regulation

Europe Letter: Facebook, Twitter and YouTube are hurrying to take down content

When EU justice ministers sit down to lunch on Friday in Luxembourg they will have an informal discussion about how the European Union should step up the fight against online hate speech – specifically, one of the great challenges of the new digital age: whether and, if so, how giant social-media organisations like Facebook can be brought into some form of regulation, voluntary or legally enforced.

The debate is part of an ongoing exchange about the digital market that ranges from ambitious proposals for putting fibre broadband in every town in the union to wider discussions about regulation of the storage of data.

The latter is of particular concern to Ireland, whose cautious perspective on regulation is very much moulded by the interests of the large digital companies based in Ireland, where huge amounts of data are stored. The price, we are being told by our European partners, for housing such profitable ventures is accepting the burden of regulation – and probably an enormous expansion in the remit of the Data Protection Commissioner. Of course, our real concern, we say, is preserving the freewheeling ways that are so much part of the success of this transformative industry. Ho-hum.

Hate speech, a special case of the wider discussion, raises many of the same issues and difficulties, not least the balance between preserving free speech, including a right to be obnoxious, however offended recipients may feel, and the laudable aim of curbing speech that seriously inflames and may well lead to violence. The challenge of defining the limits is huge. Some countries, for example, make it a criminal offence to challenge the veracity of the Holocaust. Do we want to go there?

A year ago the European Commission and four major social-media platforms announced a code of conduct for countering illegal online hate speech, defined as the public incitement to violence or hatred on the basis of race, colour, religion, descent and national or ethnic origin, among other characteristics. It included a series of voluntary commitments by Facebook, Twitter, YouTube and Microsoft to combat the spread of such content in Europe, the most crucial of which was their willingness to move fast to take down content that breached their "community standards".

The commission has just reported on progress in 24 EU countries.

Facebook won praise for reviewing most complaints within a 24-hour target. It assessed notifications of hateful content in less than 24 hours in 58 per cent of cases, up from 50 per cent in December, according to the report. The others were less successful. Twitter speeded up its dealing with notifications, reviewing 39 per cent of them in less than a day, as opposed to 23.5 per cent in December. YouTube, on the other hand, slowed down, reviewing 42.6 per cent of notifications in less than 24 hours, down from 60.8 per cent in December.

Calling the results encouraging for the commission's push for self-regulation, the justice commissioner, Vera Jourova, said the proportion of offending items taken down had doubled and action was being taken more quickly.

The imperative for the social-media groups to engage with the commission's code is clear: they can see what is coming down the tracks, and it is a lot more than voluntary self-regulation.

Last week EU ministers approved plans to force social networks to take measures to block videos with hateful content – although live-streaming was excluded, after objections on practical grounds from Ireland, among others. And the German government approved a plan in April to fine companies up to €50 million if they fail to remove hateful postings quickly.

There is a broad consensus that something has to be done. Other categories of online content on social-media platforms, from child pornography to extreme violence and terrorist propaganda, and even legally defamatory matter – all of which clearly breach community standards – will also require new approaches from both the industry and government.

Some will be regulatory – in Ireland, Minister for Communications Denis Naughten is promising measures – but the industry could take a first step by embracing voluntary self-regulation, as the press has done in establishing the independent Press Council of Ireland. A creative expansion of its remit and mandate could fit the bill.

Patrick Smyth was until recently a national newspaper representative on the Press Council of Ireland