Steve Jobs was not the only mastermind of computing to leave the stage in recent weeks
EVERYONE IS still talking about the passing of Steve Jobs but it seems that in the computer world, as elsewhere, deaths must come in threes. In the same month, two people less well known outside the tech world – Dennis Ritchie (70) and John McCarthy (84) – also passed away.
Between them lies an amazing breadth of intent, ambition and influence. Some 50 years ago, each created a small project that expanded to fill much of the space occupied by computers in our world.
At the end of the 1960s and into the early 1970s, Ritchie co-created the Unix operating system and invented a computer language called C. A decade earlier, at the end of the 1950s, McCarthy had pioneered artificial intelligence and a language called Lisp. It could truly be said that much of what has happened in the world of computers, theoretical and practical, came from the dance between those two languages, and the two philosophies that surrounded them.
It would take many volumes to document precisely those philosophies, and this column is at least ostensibly in the business section, so let me spell them out briefly and then explain their ramifications in our down-to-earth, bottom-line, tech-driven world.
McCarthy, in 1956, dreamed of computers that would be as smart as people, and convened the first conference on what he termed “artificial intelligence”. As part of the research to achieve just that, he sketched out, and implemented, a remarkably concise computer language that would allow both programs and data to be stored and manipulated in the same way.
McCarthy’s proposal, which became the language Lisp, was perhaps best understood as an attempt to grant computers the opportunity to represent their own “mental” states in code, and to reprogram themselves on the fly. In about a page of code, McCarthy wrote a description of his language in the language itself, wrapped in an explanation showing that practically any other computing task could be expressed in it.
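To make that idea concrete, here is a minimal sketch in Python rather than in Lisp itself; the operators and the evaluate function are illustrative inventions, not anything from McCarthy's paper. The point it shows is that a program can be ordinary data: an expression is written as nested lists, which other code can inspect, rewrite and then run with a short evaluator.

    # A toy illustration (not McCarthy's original Lisp): a program is just
    # nested lists, and a short function is enough to run it.

    def evaluate(expr):
        """Evaluate an expression written as nested lists, e.g. ["+", 1, ["*", 2, 3]]."""
        if isinstance(expr, (int, float)):      # a number evaluates to itself
            return expr
        op, *args = expr                        # otherwise: an operator and its arguments
        values = [evaluate(a) for a in args]    # evaluate the arguments recursively
        if op == "+":
            return sum(values)
        if op == "*":
            product = 1
            for v in values:
                product *= v
            return product
        raise ValueError("unknown operator: " + str(op))

    # Because the program is ordinary data, other code can inspect or rewrite
    # it before running it, which is the property that excited McCarthy.
    program = ["+", 1, ["*", 2, 3]]
    print(evaluate(program))   # prints 7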
Lisp’s purity and simplicity amaze computer scientists to this day, and gave hope to a generation that equally simple, perfect solutions could be found for providing computers with human-like intelligence and common sense.
Dennis Ritchie was an engineer at Bell Labs, the research wing of AT&T. His claim to fame was a skunkworks project at the firm to bring the complexity of the expanding world of computers down to a manageable size.
In response to the convoluted attempts to do many complex things at once, Ritchie and his colleague Ken Thompson built a minimal operating system, Unix, and Ritchie created C, a practical, no-nonsense programming language intended for writing small programs that ran fast.
Twenty years later, in 1990, another pioneer, Richard Gabriel, described the gulf between these two worlds. In a talk, Worse is Better, he characterised the Ritchie model of computer programming as the New Jersey school. It prioritised simplicity over correctness, consistency and completeness.
The New Jersey school was a deliberate contrast to the world founded by McCarthy and his colleagues in the 1950s, who were, Gabriel said, seeking solutions that did the “right thing”: an answer to computer problems that would be correct, complete and universal.
Gabriel, who was definitely in the Lisp camp, felt that while his artificial intelligence-seeking colleagues were technically right, they had lost out to the descendants of Ritchie, whose approach had better survival characteristics: it shipped first, and could be improved in the real world instead of staying within the ivory towers of academia until it was perfected.
What is odd, in the twilight of these two pioneers, is how both their viewpoints have begun to merge and coalesce. The job done by Ritchie’s simple, practical Unix system was to bring computing to the whole world. Now we have a world where such technology is ubiquitous, and many of its developers have begun to use that luxury to return to the purity and dreams of McCarthy.
To give an example, McCarthy often spoke about how we should build computer intelligence so that the way humans naturally treat computers – assuming they have internal mental states of their own – would eventually match how those machines actually work.
Even when dealing with a device as simple as a thermostat, said McCarthy, we think of it as “wanting the room to be the right temperature”. We should build thermostats that really do want this goal, he said, and carry out all reasonable steps to achieve it.
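As a rough sketch of what McCarthy meant, the fragment below models a thermostat as something with a goal and a rule for pursuing it; the class, its names and the stand-in sensor readings are invented for illustration, not drawn from McCarthy's writing or from any real device.

    # A toy goal-seeking thermostat in the spirit of McCarthy's example.
    # It "wants" the room at a target temperature and chooses whichever
    # action moves the room toward that goal.

    class GoalSeekingThermostat:
        def __init__(self, desired_temp, tolerance=0.5):
            self.desired_temp = desired_temp   # the state it "wants" the room to reach
            self.tolerance = tolerance         # how close counts as achieved

        def step(self, current_temp):
            """Pick the action that moves the room toward the goal."""
            if current_temp < self.desired_temp - self.tolerance:
                return "heat on"               # too cold: take a step toward the goal
            if current_temp > self.desired_temp + self.tolerance:
                return "heat off"              # too warm: back off
            return "idle"                      # goal achieved: nothing to do

    thermostat = GoalSeekingThermostat(desired_temp=20.0)
    for reading in [17.2, 19.8, 21.4]:
        print(reading, "->", thermostat.step(reading))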
This week, a company, Nest Labs, began marketing its Learning Thermostat. Its creators hail from Apple, a place built on Ritchie’s Unix and the strong desire to ship real code. Its advisers come from McCarthy’s MIT Artificial Intelligence Lab.
The point of their thermostat is that it should silently learn what you really want, rather than needing to be set. A thermostat with common sense.
As many have noted, it has been an amazing time to be alive – when the giants of computing were still walking the earth. As they leave, all of them – Jobs, McCarthy, Ritchie – seem to have looked back with some disappointment at how few of their goals were achieved in their lifetimes.
I won’t presume any of them would be pleased with a smart thermostat. But it’s a start on a road far longer than a single life, or three generations of geniuses.