How Does Innovation Really Happen?

The following is an excerpt from my new book, Mapping Innovation: A Playbook For Navigating a Disruptive Age.

On December 9th, 1968, a research project funded by the US Department of Defense launched a revolution. Its target was not a Cold War adversary or even a resource-rich banana republic, but rather the ambition to “augment human intellect,” and the man driving it was not a general but a mild-mannered engineer named Douglas Engelbart.

It’s hard to fully grasp what happened that day without understanding the context of the time. In those days, very few people ever saw a computer. They were, in large part, mysterious machines to be used only by a select priesthood who were conversant in the strange mathematical languages required to communicate with them. The tasks they performed were just as obscure, carrying out complex calculations for scientific experiments and managing mundane back-office tasks for large organizations.

But here was Engelbart, dressed in a short-sleeved white shirt and a thin black tie, standing in front of a 20-foot-high screen and explaining in his low-key voice how “intellectual workers” could actually interact with computers. What’s more, he began to show them.

As he began to type a document on a simple keyboard, words started to appear, which he could then edit, rearrange, and enrich with graphics and sound, all the while navigating around the screen with a small device he called a “mouse.” Nobody had ever seen anything remotely like it.

The presentation would prove to be so consequential that it is now called “The Mother of All Demos.” Two of those in attendance, Bob Taylor and Alan Kay, would go on to further develop Engelbart’s ideas into the Alto, the first truly personal computer. Later, Steve Jobs would take many elements of the Alto to create the Macintosh.

So who deserves credit? Engelbart, for coming up with the idea? Taylor and Kay, for engineering solutions around it? Jobs, for turning it all into a marketable product that made an impact on the world?

Maybe none of them.

Engelbart got the ideas that led to “The Mother of All Demos” from Vannevar Bush’s famous essay, As We May Think, so maybe we should consider Bush the father of the personal computer.

But why stop there? After all, it was John von Neumann who invented the eponymous architecture that made modern computers possible. And that, in turn, relied on Alan Turing’s breakthrough concept of a “universal computer.”

Or maybe we should credit Robert Noyce and Jack Kilby for developing the microchip that powered the digital revolution? Or Bill Gates, who built the company that made much of the software that allowed businesses to use computers productively?

The story doesn’t seem any clearer when we try to look at the events that led to modern computing as a linear sequence going forward. Turing never set out to invent a machine. He was, in fact, trying to solve a problem in mathematical logic: the question of whether all numbers are computable. He created his idea of a universal computer, now known as a Turing machine, to show that it was possible to create a device that could “compute all computable numbers,” but ironically, in doing so, he proved that not all numbers are computable. His work was an extension of Kurt Gödel’s famous incompleteness theorems, which showed that logical systems themselves are inherently incomplete.

It was these two insights, the illogic of logical systems and the incomputability of certain numbers, that led to the powerful logic of modern computers we see all around us every day. Confusing, to be sure.

The waters muddy even further when we try to gauge the impact of personal computing. We know that Xerox built the first Alto in 1973 and Apple launched the Macintosh with great fanfare in 1984, but as late as 1987 the economist Robert Solow remarked, “You can see the computer age everywhere but in the productivity statistics.” In fact, economists didn’t start seeing any real economic impact from information technology until the late 1990s, nearly 30 years after “The Mother of All Demos.” So what happened in the interim?

It seems that any time we try to understand an innovation through events, the story only gets more tangled and bewildering.

And it doesn’t get any clearer if we look at the innovators themselves. Some were highly trained PhDs, but others were college dropouts. Some were introverts. Others were extroverts. Some worked for the government, others in industry. Some worked in groups, but others largely alone.

Yet that brings us to an even more important question: How should we pursue innovation?