40 years of Apple is a timeline of innovation in personal computing

The history of the IT giant often affords too much significance to Jobs - and plays down the multiple missteps along the way

Apple has turned 40. There's something incongruous about the landmark, as if technology companies are not, really, supposed to reach middle age. That the company reached the milestone at all is something of a miracle given it was stuck in a state of perpetual crisis in its teenage years. That it reached 40 as the biggest and most influential technological company in the world rather than a scrappy underdog is astonishing.

In the popular imagination, those four decades represent the history of personal computing - the narrative arc from Steve Jobs and Steve Wozniak toiling on a home-made computer in a garage to the corporate behemoth that makes the glass-and-metal computers in our pockets encompasses the evolutionary timeline of popular computing, a sort of technological equivalent of the rise of Homo sapiens from primitive ape to modern human.

There have been significant flops along the way, of course, but Apple’s highlights reel can effectively stand in for the history of personal computing in a way that no other company’s can: the breakthrough Apple II brought desktop computing into the mainstream; the Macintosh introduced a smiling face and the graphical user interface; the iMac brought vibrant style to the previously dull world of PCs; the iPod put all our music in our pockets; the iPhone revolutionised mobile computing.

History of modern tech

The story of Apple and Steve Jobs, then, functions as a convenient narrative shorthand for the wider understanding of the history of modern technology. That narrative is an important one, of course, but it is misleading in many crucial ways. Most obviously, it affords far too much significance to Steve Jobs and Apple: it ignores the early Altair, downplays Intel’s critical achievements, airbrushes out the importance of Xerox, elides IBM, reduces Microsoft to the role of rival, and so on.

But more subtly, it reduces the story of modern computing to a handful of Steve Jobs eureka moments, which fundamentally miscasts the nature of technological innovation. Rather than a series of sudden great leaps forward, technological progress is a process of constant iteration, with some breakthroughs and speed bumps along the way.

The British technology writer Charles Arthur recently came up with a useful metaphor to appreciate the process. "Remember the last time you took a bath or shower and it started lukewarm but you gradually warmed it by adding more hot water, until it reached a temperature so hot you could never have got into it at the start? Isn't it strange how we can be immune to subtle, slow changes all around us?" he wrote. "Then there's the other extreme - the ice bucket experience, where you're abruptly plunged into something so dramatically different you can't think of anything else."

The metaphor isn’t perfect - that’s not how lots of people take a bath, for one thing - but it is a rather persuasive way of highlighting how we perceive progress. We clearly remember the ice bucket moments - the Macintosh reveal, the arrival of the iPhone - while barely perceiving the increasing temperature in the bath - faster processors, more sophisticated graphics, cheaper and more ingenious sensors, all combining to power an array of life-changing devices.

Bill Gates famously pointed out that "Most people overestimate what they can do in one year and underestimate what they can do in 10 years". That insight can also be applied to what we expect of technology - we overestimate what can be achieved in the next 12 months, and yet rarely express amazement at what has been achieved in the past decade.

For instance, by the time artificial intelligence and self-driving cars have matured into mainstream technologies, they are likely to feel like a warm bath, astonishing achievements that seem a little underwhelming given their long gestation.

Rarely first to market

The metaphor also illuminates why Apple is widely perceived as the greatest of innovators. Cynics often point out that Apple is rarely first to any market, and more often than not isn’t even using entirely new technologies. While rival companies would be adding more and more hot water to the bath, Steve Jobs would bide his time until the underlying technology had matured, then produce a polished and well-executed device that seemed like a sudden breakthrough.

Jobs, and by extension Apple, blithely ignored the maxim that the customer is always right, in favour of an attitude that they knew best and the customer could take it or leave it. That highly opinionated stance is divisive, to say the least, and antagonises many people with its implicit arrogance. But the approach leads to occasional ice bucket moments.

It will be interesting to see whether the Apple Watch will eventually be included in that category - what 10 years ago would have seemed a magical, wondrous thing now feels undercooked. Maybe the watch will be remembered as a latter-day Newton, Apple’s mid-90s PDA whose ambitions were at least a decade ahead of the technology; maybe it will be seen as a turning point in wearable computing.

Forty years in, perhaps Apple’s greatest trick is making the warm bath feel like the ice bucket every once in a while. It’s not actually how technological progress happens, but Apple sure makes it seem that way.