
Facial recognition: We’re tilting alarmingly towards normalising sneakier mass surveillance

We have yet to be given clear reasons why we need facial recognition technology even though a Bill relating to it is meandering through our political process

As the Government’s draft facial recognition technology (FRT) Bill meanders through the political process, serious global questions about this pervasive population surveillance tool continue to mount.

The draft Bill, officially the Draft General Scheme of Garda Síochána (Recording Devices) (Amendment) Bill, does little to answer any of them. Neither does one of the key European Union acts that allows FRT's implementation, the recently passed Artificial Intelligence Act.

That FRT is largely to be managed through an AI Act tells you much about why FRT should rank high on the list of ways in which the EU is tilting alarmingly towards normalising ever greater, more powerful and sneakier methods of mass surveillance.

FRT attempts to identify potential suspects, or spot the commission of a crime, by scanning and analysing stored video or live camera footage and matching a face to a database of potential suspects.


But the database is not a selection of suspects chosen on reasonable grounds they might have attempted or committed the crime in question.

The database is all of us.

Such methods would once have been dismissed as too invasive, too expansive, too disproportionate and too undemocratic. And they may yet be, as legal challenges up to the European Court of Justice (ECJ) level are sure to come once these technologies start to be used, and cases of wrongful arrest and unwarranted surveillance begin to mount, as they most surely will.

With FRT, we automatically get the double-panopticon whammy of coupled-up FRT and AI digital surveillance.

We also get it without any strong evidence that scanning populations, storing huge databases of faces and searching through them to try to identify suspects is particularly successful, or that any occasional success would outweigh the massive overstep of all of us effectively being placed in a digital police lineup built from our sensitive biometric information.


FRT joins other unacceptably invasive technologies and policies in my informal “summer of surveillance” series on how the EU and its states are increasingly attempting to introduce levels of population-spying that would have been anathema only years earlier.

Others include data retention (the collection and storage of everyone's communications data) and proposals to build weakened digital "back doors" into widely used communications methods such as encrypted apps, which of course then become broken-encryption apps.

Claims around the accuracy of the technology have been regularly debunked, with many of the accuracy figures coming from entirely different use cases, such as matching two relatively clear still images, rather than from real-world footage.

In particular, it is well-established through numerous studies that FRT has a serious gender and race bias, reflecting problems with the sets of data it is trained on. If that sounds familiar, it’s because the exact same problems are endemic in AI, too, and FRT and AI are intertwined.

Unfortunately, it’s not just studies that demonstrate such bias. It’s real life and real people, almost always people of colour, having their lives turned upside down after being incorrectly matched by FRT to a crime.

Last month, I moderated an event for the Irish Council for Civil Liberties on FRT which featured Robert Williams, a black man from Detroit who arrived home from work to be wrongfully arrested on his driveway, in front of his wife and two small daughters and his neighbours, based on a false FRT match for a minor crime.

Williams spoke about the ongoing trauma to himself and his family, particularly his children, of his being taken away, then held and repeatedly questioned by police for a day and a half before being released. His appalling experience was taken up by the American Civil Liberties Union (ACLU). Last Friday, he won his groundbreaking case against the Detroit Police Department, which will put new procedures and protections in place as a result.

While in Ireland, he urged the Government not to go down the same FRT route: "Here in Ireland, you have an opportunity not to introduce it in the first place. I hope your Government will listen to experiences like mine and think twice before bringing FRT into policing."

He also noted wryly that Detroit could have put the tens of millions it spent on faulty FRT towards addressing the poverty that often pushes people towards crime.

In February, the Joint Oireachtas Committee on Justice raised major questions about the draft FRT Bill, calling for An Garda Síochána and the Department of Justice to clarify how the technology would be used and what databases of imagery it would draw upon. No answers have been forthcoming.

The only known Irish database of biometric information is the one gathered for the controversial Public Services Card. Yes, the card we were told would be used for nothing except social services, but which then acquired so many additional uses that the Data Protection Commissioner got involved. It must not be repurposed for FRT.

That the existing Irish Bill is even vaguer than the EU AI Act's provisions on FRT raises further concerns. Williams's case is a concrete example of what can go wrong when police rely on FRT. We deserve clear answers from the State about exactly how it intends to use FRT, and why it's needed at all.