The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution

By Walter Isaacson

Simon & Schuster, 560 pages, $35

If you have found yourself slipping into dark thoughts recently about whether a robot is going to take your job or online monitoring will lead to a surveillance state, then this tour d’horizon of the computer age is for you. It presents a deeply comforting, humanistic vision: of how a succession of brilliant individuals, often working together in mutually supportive groups, built on each other’s ideas to create a pervasive digital culture in which man and machine live together in amicable symbiosis.

So it is a shame that Walter Isaacson’s “The Innovators” ends at about the turn of the century, with the triumph of the PC and the world wide web. The subsequent rise of big data, giant computing “clouds” and National Security Agency surveillance points to a future in which technology’s ability to set mankind free is far from guaranteed.

Ada Lovelace (1815-52), daughter of the poet Lord Byron, is the guiding spirit of this book. Though credited in computing circles, somewhat hyperbolically, with being the world’s first programmer — more than 100 years before the first computer was actually built — Lovelace seldom gets wider recognition. Inspired by Charles Babbage’s design for what he called an “analytical engine”, which was never built, she conceived of algorithms that would make the machine capable of many different tasks depending on the routines it was asked to perform. For Lovelace, computing was a marriage of science and art that could never take the place of humans, and Isaacson in turn judges the contributions of his “hackers, geniuses and geeks” according to their fidelity to these founding ideals.

In his bestselling 2011 biography of Steve Jobs, Isaacson had the freedom to describe a complex and compelling character in depth. Here, space constraints necessitate a more succinct treatment of the many inventors of computers, software, microprocessors, the internet, PCs and much more. Yet even for readers who know most parts of this familiar story, reading it in its entirety provides a fresh perspective on the birth of the information age.

Unlike Jobs, who was in equal parts brilliant and obnoxious, those singled out for special mention by Isaacson freely share both ideas and credit for their discoveries. The writer is at pains to stress that innovation often works best when allied to a selfless, highly collaborative approach.

Jack Kilby and Robert Noyce are typical of these men (outside the very early days of computing, there are few women). Each came up with the idea of the microchip independently, yet each was willing to defer to the other. In Isaacson’s simplistic formulation: “They were both decent people: they came from tight-knit small communities in the Midwest and were well grounded.”

Many of the pioneers grew up as tinkerers in small towns before finding their way into the scientific brain trusts of the US military-industrial complex. Their work was shaped and steered by the bigger vision of men such as Vannevar Bush and J.C.R. Licklider, who saw with extraordinary clarity the future of personal computing and the internet before the first hulking computers were even built.

For all the importance of teamwork, however, it is the moments of solitary invention, the outsized characters and the clashes of giant egos that usually provide the most compelling fodder for scientific histories. Fortunately, Isaacson makes enough space in his technology morality tale for some of these. They include the catastrophically divisive William Shockley, a central figure in the development of the transistor, whose compulsion to claim credit for the work of others led to the defection of his entire team and eventually the formation of Intel. Most tragic is the story of Alan Turing, the British cryptographer and computer scientist who eventually killed himself after being chemically castrated for homosexuality.

There are also the occasional brilliant loners, hucksters and showmen. They include characters such as Doug Engelbart, who came up with essentially all of the technologies for the modern PC two decades before the first one hit the market, and Stewart Brand, the futurist impresario who conducted early experiments into LSD and helped to inject the hippie ethic into Silicon Valley culture.

Much of the history here is stitched together from other work, particularly in the early chapters. But Isaacson also makes use of many interviews with the central figures he describes, enabling him to take on some of the occasionally apocryphal stories that litter the commonly accepted history. These include the question of whether Al Gore really deserves the ridicule he often gets for claiming a central role in the development of the internet (no, according to Isaacson); whether the internet was conceived as a network capable of withstanding nuclear attack (yes and no, depending on whether you talk to the people who paid for it or who actually built it); and whether, if the web had included a protocol to handle micropayments at the outset, paid-for journalism would have thrived online (yes).

Through it all, Isaacson has a single overriding belief: that computers and the internet have enhanced and augmented human experience. The shadowy villains, consigned to a walk-on part, are champions of artificial intelligence, who see machines as in some sense replacing people. Tellingly, John McCarthy, who organised the first AI conference, is shown committing the kind of act that ranks as a sin in Isaacson’s moral universe: he is said to have resisted connecting his university’s computer to the first version of the internet in order to keep its resources for himself.

Ada Lovelace’s true heirs, the author intimates, are the web pioneers who brought mankind together online, starting with Tim Berners-Lee, who made no attempt to keep any rights to his historic invention. They include Jimmy Wales of Wikipedia; Evan Williams, who was behind one of the first blogging platforms and co-founded Twitter; and Larry Page, who wrote the first Google algorithm that brought order to the chaos of the web.

The PC and the web, however, are giving way to a new digital reality, and it is one that looks far more like the world of centralised computing power that came before. Information is amassed in vast repositories. Processing is concentrated in utility-grade facilities that are cutting costs dramatically — though at the expense of putting more power into far fewer hands. Control in this new age will reside with giant corporations and governments — all of them, no doubt, vowing to put their users first and make the world a better place.

Leaping briefly ahead at the end, Isaacson tackles artificial intelligence head-on. Smarter-than-human machines that will one day advance far beyond the capabilities of their creators do not figure in his predictions. Rather, he believes that mankind’s digital tools will always play a supporting role, augmenting rather than replacing their makers. It is a comforting vision — but one that could well be challenged by future generations of the innovators he celebrates here.

–Financial Times