Either we start to disconnect our increasingly networked world or we risk daunting social, safety, security and privacy consequences, a leading computer security expert and author has warned.
In an expansive talk directly challenging widely held assumptions about the benefits of computing, networks and the internet, Bruce Schneier told a large audience at this year's RSA Security Conference in San Francisco that we were moving towards a networked world so complex that we would be unable to safely manage it or adequately grapple with inevitable disasters.
Schneier, who is always one of the most popular speakers at the event, which drew nearly 40,000 people this year, pinpoints what he calls vast “socio-technical systems” as the critical issue. He describes these as complex, interconnected social and technical systems.
Examples, he says, are as diverse as chip-and-PIN authentication systems, online reputation systems, bitcoin infrastructure, e-voting systems, police documents such as no-fly lists and social media networks.
While technology has long had a social dimension, such systems “are different in a computer world. You see more people, you see increased speed, more frequency, decreased cost.”
Threats emerge because these systems can scale along many different dimensions, enabling privacy and security harms such as ubiquitous surveillance and large-scale data collection.
Such systems also grow vulnerable over time: poorly written code from a single programmer at the outset can affect them years later.
They also have hidden and complicated interactions.
“We can’t see how they work. They work in ways we don’t understand. These are systems where algorithms replace people, where systems replace judgment,” Schneier says.
Some of them are very large-scale. Two billion people have email addresses, one billion are on Facebook and a billion credit cards are in circulation in the US.
“These are crazy numbers. I think we’ve really reached a phase-change due to such scale and scope,” Schneier says.
Socio-technical systems are so large and so ubiquitous “that computing ends up being the substrate of how we live”.
Consider our personal networks, wearable devices, smart homes, smart organisations, smart cities and digital governments.
“When these [kinds of] systems become interconnected, they no longer make up a web that you connect to. Together, they make a new world that you live in no matter where you go or what you do. This is more than the internet.”
He calls this agglomeration “the world-sized web”, a combination of several things: mobile, cloud computing, the internet of things (IoT), persistent computing, autonomy and artificial intelligence, “smart things that act on the world in a direct physical manner”, which will ultimately have physical as well as profound social effects.
As we produce one big interconnected system, we have no idea how to secure “this massive world that we don’t know much about, but which knows everything about us and is increasingly observant, increasingly intelligent”.
And that “final solution” of the past – turning our computers off or putting our phones away – “is not going to work any more. The stakes are getting higher.”
Due to the connected “things” aspect of the IoT, risks are growing increasingly complex. “When you work with things, it’s no longer about risks to data. There are actual risks to cars and property. These are real-world effects that are much greater, and algorithms can have lifelong effects on people and communities,” Schneier says.
Biases and errors in these large systems “have much more dramatic effects”.
He predicts escalating confrontations over rights, such as the right to tinker with systems or devices, versus the security risks of tinkering, or one’s right to fly a drone set against others’ privacy rights, or the sensitivity of an individual’s medical data versus the social benefit of collecting data for research.
Such a scenario will result in “security as an arms race”. Then “the way to think of technology in this context is as something that perturbs security, that changes the balance between attacker and defender”.
Modern trends
Three modern trends affect this arms race model, says Schneier.
First, power balances change as these systems become prevalent. New technologies can give power to those who previously had little – such as protesters or small entrepreneurs utilising crowdfunding. But on the flip side, new reach is given to the “already powerful”, such as governments and corporations, which are initially slow to adopt technologies but eventually able to leverage them at devastatingly large scale – state surveillance programmes, for example.
Second, “attackers have an advantage over defenders”. Attackers always have a first-mover advantage, and complex systems are harder to defend.
Third, catastrophic risks increase. “Attacks scale with technology,” says Schneier. “Fewer attackers can do more damage, thanks to technology. Security is a numbers game.”
The obvious counter is to enhance security, but Schneier notes that again, because of this growing complexity, it is “very hard technically and I’m not sure right now that we have the ability to do this. And I’m not sure we have the incentive. Go to Kickstarter and look at all the cool projects for the internet of things, and look at how much time any of them are spending on security.”
Security should, of course, be built in from the start, but “there’s a whole lot of work in getting it right the first time”. Meanwhile, the security industry has the model of “agile” security, focusing on fixing problems quickly.
“So we need to get it right the first time, and somehow quickly fix it, because we didn’t. I don’t know how we’re going to do that. I see some saying we’ll have to choose, but we don’t have that choice.”
Schneier sees these challenges and confusions “primarily as a policy problem” because of the huge impacts on society.
“We need to think about a new regulatory agency, a new way of thinking about these systems. A department of technology policy or something. Maybe we can argue that’s too broad, but all technology is computer technology today,” he says.
He concedes his proposal is not novel. “New technology has led to a lot of new government regulators,” as happened after the arrival of trains, cars and radio.
He sees no alternative to regulation “because the alternative is ad hoc and piecemeal”.
Corporate power
People also need to recognise that markets will not solve these problems.
“Markets are inherently individual and profit-motivated. We need something to counterbalance corporate power right now. Of course, there are going to be issues . . . but governments are going to get involved in this, the risks are too great and the stakes are too high. The choice is smarter government involvement or dumber government involvement.”
And on the technical side? We’re currently “in a honeymoon stage of connectivity. It’s like governments and corporations are punch-drunk on data. They’re also thinking about connecting everything. We used to have ‘collect it all’, now we have ‘connect it all’.”
So it’s time to go the other way.
“I think we need to start disconnecting systems. If we can’t control it, we have to not build a world where everything is connected,” he says.
Schneier says we need to move to “more distributed systems, to smart failure states, to the ability to function offline. I think we’re going to reach a high water mark for connectivity – and that afterwards, we’re going to start making conscious decisions about what to connect.”
Nuclear power offers a model, he says: it hit a high water mark in the 1970s, but then people realised plants were dangerous to build and maintain.
“We saw the risks. I think we’re also going to do that with data, with internet connections, with connectivity,” he says.
And we do need to start to disconnect, he says, because “while good laws and technologies in western democracies are a good second line of defence, they really can’t be our only line of defence”.
Echoing a recurring theme at the conference this year, he says that ultimately, policy-makers and technologists need to start talking and working together, even though more typically, they end up in opposition, as the current Apple v FBI court case demonstrates.
But too much is at stake for continued division.
“The world-sized web will change everything. There will be more real world consequences, more autonomy, more catastrophic risks and fewer off switches. And more power to the powerful. This world-sized web is less designed than created and it’s coming without any forethought, architecting or planning. And most of us are completely unaware of what we are building here.”