
Fintan O’Toole: Why we didn’t think the cyber criminals were coming for us

It’s time for a government department devoted to counteracting inevitable threats

The pandemic and the cyberattack on the HSE have two things in common. One is that they were certain to happen some time. The other is that almost all of us assumed they wouldn’t happen any time soon – and they wouldn’t happen to us.

This contradiction is rooted in one of the facts of contemporary life: a huge and ever-growing gap between expertise on the one hand and the public realm on the other. This is not just a matter of the wilful ignorance of the new right, its deliberate assault on science, evidence and fact. It is more fundamental than that.

There is a paradox of technology: the smarter it is, the more its mass application depends on ignorance. Most new technologies start out as coterie obsessions. They are developed – by definition – by people who understand them. But in order to have general usefulness – and to be profitable as commodities – they have to be adopted by more and more people who have only the vaguest conception of how they work.


Cars were developed by people who could take apart and reassemble a combustion engine. Planes were invented by nutters who could, and would, actually fly them. But neither would have transformed the world if things had stayed like that. They had to be useful to the vast majority of people who barely know one end of an engine from the other.


It’s the same with information technology – except that the knowledge gap is much, much more dangerous. If you don’t know how your car works, you get a mechanic to fix it. If you don’t know how IT (information technology) works, those who do have extraordinary power over your life.

Broadly speaking, societies have traditionally dealt with the knowledge gap by controlling the experts. This is done formally by the State – by ensuring that they have approved qualifications or are members of the right professional bodies.

It is also done by self-regulation. In return for prestige and a decent income, the experts police their own reputations through codes of ethics and international standards.

In extreme cases, both the law and professional bodies get involved. A rogue doctor can be both prosecuted and barred from practice by the Medical Council.

The problem with IT, and in particular with internet-based systems, is that these mechanisms don’t work. The knowledge gap obviously exists: I know how to operate the computer I am writing this on and I know how to put it into the Irish Times system. But I have only the foggiest notion of how this computer or that system works.

The seductive thing is that this gap in knowledge is fabulously convenient. It is not just that the systems are very complex. It is that an immense amount of money and ingenuity has gone into making them astonishingly easy to use.

The ‘intuitive’ paradox

The very word that is used to describe this ease – intuitive – is almost a synonym for ignorance. It means that you can enjoy the benefits without so much as having to read an instruction manual.

You can take a dizzyingly complicated machine out of a box, press a button, follow a few prompts and then just pick up anything you need to know as you go along. What you go along doing is communicating almost everything about yourself.

While you’re doing this, though, you are ceding control over more and more of your life to those who do know how the thing works and who can therefore make it work for them. And most of us – most certainly including me – are very naive about this transfer of power. This naivety extends from individuals to collective institutions, including governments.


The difficulty is that, ever since the great technological revolutions of the 19th century, we’ve been living with two assumptions. One is that the expertise will be acquired formally, in institutes or universities, and will therefore be infused with collective values. The other is that its use can be regulated. The smart people will constrain themselves, and if they don’t, they can be constrained.

Cybercrime is taking over because neither of these assumptions works with IT.

Firstly, the knowledge needed to break into and control other people’s computers doesn’t have to be acquired formally. Indeed, the way much of it developed is a complete upending of traditional education: kids acquired the expertise before adults did.

In his brilliant book DarkMarket, published 10 years ago, Misha Glenny showed how cybercrime was effectively developed by adolescent boys. They were not looking for money. They just wanted to be able to play computer games that their parents couldn't or wouldn't buy for them.

As far back as 1982, a German kid who used the handle MiCe! figured out that game developers were inserting specific bits of code to stop their products being copied. Using trial and error, he rewrote the code until he finally cracked it. These discoveries were shared with other kids, and a huge network of teenagers (overwhelmingly male) rapidly developed, each of them competing to be the first to crack a new game.

Grim game

If its ultimate results were not so grim, there would be something marvellous in this children’s revolt. But it carried with it a deadly germ – this form of knowledge has encoded within it the ethics of adolescent boys who spend most of their time on screens. It is so detached from human consequences that crippling health systems is just another thrilling game with lucrative prizes.

Secondly, of course, it is impossible to impose regulation on the use of the internet without universal transnational enforcement. We have the exact opposite. It is not just that there are states that turn a blind eye to cybercrimes against countries they don't like. It is that states themselves engage in cybercrime on a massive scale – Russia's hacking of the Democratic Party in the US being an obvious example.

These states don't even have to be rich and highly developed. The first big hack to become an international sensation – the huge theft and dumping of data from Sony Pictures in 2014 – was almost certainly the work of North Korea, a country where the vast majority of people have no reliable electricity supply, never mind internet access.

The rules governing the use of expertise have not changed for good – they have been abolished. Yet old assumptions die hard.

Politicians and public servants – but also, I suspect, an awful lot of private companies – find it easier not to think about the arrival in the world of people who can exploit without limit or conscience the gap between those who use technology and those who know it.

We have to start thinking differently. The pandemic was coming. Climate change is coming. More and more cyberattacks on vital infrastructure are coming. We need more than short-term fixes. For a start, we need a whole government department that does nothing else but think about, and plan to fight, the inevitable threats. The readiness is all.