
Ireland’s unanswered questions on the Facebook scandal

Irish data watchdog's approach is under scrutiny after recent revelations


In November 2013 a new application went live on Facebook. Called "thisisyourdigitallife", the app was ostensibly a personality test but, in reality, was much more.

Downloaded about 270,000 times, it asked users questions and gave them answers about their personality. Behind the scenes, it also took the answers and merged them with Facebook profile data – the quid pro quo for using the app – to build psychographic profiles.

In the background, the app’s tentacles were reaching even further.

Unless users intervened, Facebook’s rules at the time allowed all app developers to pull in or “scrape” the data not just of people using their app – but of all these users’ Facebook “friends”, even those who didn’t use the app itself.

The interlinked nature of the Facebook network meant that the personality test app scraped the data of about 322 times as many users as actually used it: an estimated 87 million Facebook users – including Mark Zuckerberg, as the Facebook founder and chief executive told the US Congress.

The genie was out of the bottle: the data treasure trove ended up with UK-based consultancy Cambridge Analytica, and a company whistleblower now claims the data was used to target voters in the 2016 US presidential election.

Although this is a scandal with a US and UK focus, Europe’s traditionally tougher privacy laws did not prevent an estimated 2.77 million EU Facebook users being caught up in the illegal data dragnet.

This raises two questions.

First, did Facebook breach EU law by allowing apps to collect data in this way?

Second, what was the role in all this of the Irish Data Protection Commissioner (DPC), given current EU law hands it front-line responsibility for overseeing the Dublin-based Facebook International?

Latest twist

Finding answers is difficult because the Cambridge Analytica scandal is just the latest twist in a long-running saga involving Facebook and the DPC. It dates back to 2011 when Austrian privacy campaigner Max Schrems, in a broad complaint challenging the legality of Facebook's European data collection, flagged concerns over Facebook app data collection.

The DPC, too, raised concerns with Facebook in two audits of the company’s operations, in 2011 and 2012.

Facebook began to make smaller changes, but it was 2014 before it tightened its sharing rules so that new apps could no longer access a user’s friends’ data without gaining explicit permission first. There were two caveats: the rules were not imposed retroactively, and pre-existing apps were granted a one-year transition period. Thus the window for mass data collection closed for good only in May 2015, four years after the 2011 Schrems complaint first flagged it.

While the DPC waited for Facebook to act, “thisisyourdigitallife” had 13 months to scrape the data of 87 million users, including in the European Union. But this is just the tip of the tip of the iceberg.

‘Shadow’ profiles

Facebook insists its primary focus is to connect people worldwide. Just as important, however, and very lucrative, is its business of selling advertising based on data it collects on users.

Facebook collects users’ data based on their interactions on the platform, but also follows them around the internet, logging where they go and what they do.

The company says it does this to serve more relevant advertising to users. But, by running users’ online activity inside Facebook and beyond through statistical models and artificial intelligence filters, Facebook can create huge “shadow” profiles far more detailed than many users realise.

Researchers claim that, by collating just 68 Facebook “likes”, they can predict – with more than 85 per cent accuracy – a user’s skin colour, sexual orientation and political affiliation, even if the user has never shared this information with the social network.

The more likes, the closer Facebook comes to knowing you better than you know yourself. That is what makes its data so interesting to political campaigners.

No one besides Facebook knows its long-term data strategy.

But the Cambridge Analytica revelations give an indication: a company with access to this data can feed you advertisements tailor-made for your personality and prejudices. The efficacy of such targeted advertising is still an open question. But every time you hit the “like” button, or visit a website, you are revealing to Facebook ever new ways for your buttons to be pushed, possibly without even knowing you are being manipulated.

‘Game changer’

Even though it was only one app, the exponential nature of the data grab from “thisisyourdigitallife” is clear from the Irish example. Just 15 Facebook users in Ireland downloaded the app linked to Cambridge Analytica. But through linked “friends”, Facebook says the app – and Cambridge Analytica – had access to the data of up to 45,000 Irish people: the populations of Mullingar and Tralee combined.
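
To make the scale of that fan-out concrete, here is a back-of-envelope sketch using only the figures quoted in this article; the helper function and the division are purely illustrative assumptions, not anything published by Facebook or the DPC.

```python
# Illustrative arithmetic only, based on the reach figures quoted in this
# article; "effective_reach" is a hypothetical helper, not a real API.

def effective_reach(installs: int, total_reached: int) -> float:
    """How many users' data were reached per direct install of the app."""
    return total_reached / installs

# Globally: about 270,000 installs exposed an estimated 87 million users.
global_multiplier = effective_reach(270_000, 87_000_000)   # ~322x

# In Ireland: 15 installs exposed up to 45,000 people.
irish_multiplier = effective_reach(15, 45_000)             # 3,000x

print(f"Global fan-out: ~{global_multiplier:.0f} users reached per install")
print(f"Irish fan-out:  ~{irish_multiplier:.0f} users reached per install")
```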

With thousands of similar apps, and more than 300 million Facebook users in Europe, it is clear that many millions of Europeans have had their data scraped via apps they never used directly.

UK data regulator Elizabeth Denham, whose office is investigating the activities of Cambridge Analytica, says recent revelations are “a game changer” for the once specialist topic of data protection.

“Suddenly everyone is paying attention,” she said last week in a speech in London. “The media, the public, parliament, the whole darn planet it seems.”

Changing social norms

Many regulators in Europe were wary of Facebook app data access. It featured in two DPC audits into Facebook’s adherence to EU data protection law. At its simplest, this requires collection and processing of EU users’ data to have a lawful basis and be proportionate.

Asked now about its 2011-2012 audits, the DPC says it found that “the collection of friend data in the unrestricted way it was happening was not proportionate in the circumstances and no adequate justification as to the proportionality was presented”.

"Unless a demonstration of proportionality could be made," it said in answers to questions from The Irish Times, "it would be contrary to EU data protection law."

Given that, in the DPC’s view, app data collection was not proportionate, was Facebook acting contrary to EU data protection law? The DPC says “there are no automatic and black and white answers to every question particularly where novel technologies are engaged”.

All of these issues, it adds, also have to be examined “on an ongoing basis against a backdrop of evolving and changing social norms”.

Facebook argued it had a legitimate business interest to collect user data as it did, and share it with app developers. After repeated questioning it remains unclear whether the DPC agrees with this argument, or with Facebook critics who would like more active DPC enforcement.

“It is not all about enforcement action,” counters the DPC. “It is about effective supervision and outcomes for data subjects.”

Textbook failure

Just what constitutes effective data protection supervision in Europe, however, has long been a deeply contested question across the continent.

Although all regulators draw on one set of EU rules, two camps have emerged over how to balance the legitimate interest of companies like Facebook in collecting data as payment for “free” services against the fundamental European right of citizens to privacy, each camp drawing on its own historical experience and legal traditions.

The Irish DPC – and the UK ICO – favour a more informal, negotiated resolution process. Direct intervention and litigation, commissioner Helen Dixon told a Dáil committee this week, was "uncertain and also very lengthy".

She pointed to gradual, so-called “granular”, improvements made after 2011-2012, before the phasing out of widespread app data sharing began in 2014.

German and Austrian regulators, at the opposite end of Europe’s privacy scale, view the app data-sharing issue as a textbook regulatory failure: action came only years after the concern was first flagged.

Far from finding litigation lengthy and uncertain, the Hamburg regulator took legal action against Google, forcing it to give Germans the option of blocking their homes from appearing on its "Street View" service. The same regulator secured an injunction to stop Facebook merging into its databases user data from WhatsApp, the messaging service it bought, because such data sharing went against a promise WhatsApp had made to its users.

“This was challenged by Facebook before the administrative courts, twice, but we have won every court round,” said Prof Johannes Caspar, data protection regulator in Hamburg, where Facebook has its German base.

European data protection lawyer Prof Herwig Hofmann, involved in the Max Schrems case at the European court in Luxembourg, sees no problem with the Irish regulator’s tradition of negotiated solutions – except where one side has a clear economic interest in stringing out proceedings.

“If there is a factory pipeline leak, an environmental regulator would tell the pipeline operator to stop the leak first and then negotiate,” he says, “and not to keep the pipeline running during negotiations”.

In many ways Europe’s ongoing data protection tensions are as much political and philosophical as legal.

Irish woman Orla Lynskey, an assistant law professor at the London School of Economics, says one problem is a lack of public awareness and education in Ireland about our digital footprint – the data we all generate online.

That in turn, she suggests, has resulted in a lack of urgency over regulating big data. Only now have fears of election manipulation made potential risks of data collection clear.

“We turned a blind eye for years because it suited the Irish narrative of development and innovation, and because no one wanted to lose out on the benefits of big data companies,” says Dr Lynskey, who specialises in data protection and digital rights.

She suggests a good regulator will use all tools available – including swift decisions and robust sanctions.

“The DPC are not using the full array of options,” she says, “and this is having a damaging effect”.

The DPC disputes that it is litigation-shy, pointing to its regular appearances in the Irish and European courts on the Schrems-Facebook case.

However, the Schrems legal team say the DPC ends up in court so regularly because of its systematic refusal to use powers at its disposal: to make legal findings, to intervene, and to impose timely remedies.

Hefty fines

Next month the DPC and other European regulators will be given even greater powers – to investigate and to impose hefty fines – under the EU’s new data protection rulebook, the General Data Protection Regulation (GDPR).

Facebook has taken out full-page newspaper advertisements welcoming the new rules, to the amusement of former EU commissioner Viviane Reding, who remembers the energetic efforts of lobbyists and governments to stop her proposals.

“Today they agree that it is an indispensable piece of legislation . . . and even a means to save democracy,” she said.

Ireland’s DPC says that, as front-line regulator for US tech giants based in Ireland such as Facebook and Google, it will use the new rulebook to seize the initiative and present new social media oversight proposals to fellow European regulators.

Others suggest this proactive approach is motivated by a new oversight body, comprising all European regulators, which will be looking over Ireland’s shoulder from next month.

“If Ireland had been robust in applying data protection rules,” says Dr Lynskey, “we would not be in a situation where we will have a new European supervisory board overseeing the DPC’s work, and able to issue binding decisions of its own”.

But Europe’s new data protection regime does not resolve outstanding issues from the recent past, in particular whether Facebook’s app data collection in Europe broke EU law.

The social media company has launched a “data abuse bounty”, rewarding people who report abuse of data by app developers, and says it is looking at “multiple ways to identify bad actors or breaches of our terms”.

“We can’t comment more specifically on this ongoing initiative,” the company says.

So how new is this concern for Facebook?

Facebook says it is conducting an investigation into data sharing by apps before 2014. Back in 2011 the DPC said in its audit report that “it is not possible for [Facebook International] to abrogate responsibility once the information is in the possession of the third party application and it does not seek to do so”.

Hamburg regulator Johannes Caspar says it is “laughable” for the company to claim now it was duped by unscrupulous developers. The data-sharing system it set up was ripe for exploitation, he says, and this was clear to the company for years.

“If you’re a supermarket owner, if you give the cashiers a day off and open the doors, you can’t be surprised when the shelves are cleared,” he said. “But a supermarket’s products belong to the owner. For Facebook, the products being sold are data of their users.”

Experienced senior data protection lawyers see a case for legal action against Facebook in Europe for allowing apps to collect data from users who never gave specific consent.

“This was not done in the open internet but within the context of Facebook,” says Prof Hofmann. “And when Facebook allowed that, and made money from that, it becomes quite clear where the responsibility lies.”

So who is going to find out what happened? Dixon said this week her office was not responsible for targeted political advertising based on user data. But such targeting is made possible by data collection and by Facebook’s app developer policies.

Abortion referendum

Six years ago the DPC queried these policies’ ability to prevent illegal disclosure of data to third parties.

For all the fuss about Cambridge Analytica, it is a symptom of a wider malaise. No one knows yet whether Facebook data-sharing rules have put other EU data sets in circulation, with potential to influence democratic processes, from the Brexit vote to Ireland’s upcoming abortion referendum.

From April 25th, Facebook announced this week, it will roll out a pilot “view ads” project in an attempt to increase the transparency of political advertising.

Finding out what happened, and moving beyond its audit process to a formal investigation of Facebook, would require the DPC to establish Facebook’s Dublin base as a data controller or controller/processor – legal terms giving the company corresponding obligations under EU law.

While Facebook confirms that its Irish subsidiary is “the data controller in relation to EU Facebook users’ personal data”, and regulated by the DPC, the Irish regulator says “we are making no finding at this time in relation to whether [Facebook] is a controller or a controller and a processor.”

Seven years after its first audit of Facebook, the Irish regulator says it has no plans for another audit but that the company is “under active examination”.

Asked how long this process will take, and what “active examination” involves, the DPC said “active examination is self-explanatory. When we complete the examination, we will then decide what further steps are appropriate”.

It has written to Facebook outlining action it requires, the DPC says.

Asked when this letter went out, and what deadline was imposed for action, the DPC replied: “We do not intend to disclose the full operational details of our supervision activities, as we do not see it as useful.”