What We Can Learn from Innovators
On innovation:
We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focused on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills and traits proved most useful? How did they think and lead and collaborate? Why did some succeed and others fail?
On the importance of teamwork to innovation:
We don’t often focus on how central teamwork is to innovation. There is a profusion of books celebrating the people we biographers portray, or mythologize, as lone inventors. I’ve produced a few myself. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology revolution was fashioned. It can also be more interesting.
Creativity is a collaborative process. Innovation comes from teams more often than from the light-bulb moments of lone geniuses. This was true of every era of innovation. The Scientific Revolution, the Enlightenment, and the Industrial Revolution all had their institutions for collaborative work and their networks for sharing ideas. But to an even greater extent, this has been true of the digital age. The inventors of the Internet achieved most of their advances through teamwork.
On collaboration between generations:
The digital age may seem revolutionary, but it was based on expanding the ideas handed down from previous generations. The collaboration was not merely among contemporaries, but also between generations. The best innovators were those who understood the trajectory of technological change and took the baton from innovators who preceded them. Steve Jobs built on the work of Alan Kay, who built on Doug Engelbart, who built on J.C.R. Licklider and Vannevar Bush. When Howard Aiken was devising his digital computer at Harvard, he was inspired by a fragment of Babbage’s Difference Engine that he found there, and he made his crew members read Ada Lovelace’s notes.
On how this book came to be:
I can recall the excitement that each new advance of the digital revolution caused for me. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college, I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services.
I began work on this book more than a decade ago. It grew out of my fascination with the digital age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information technologist and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had co-authored with a colleague about the creative teamwork of six friends who shaped America’s cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs. But his story reinforced my interest in how the development of the Internet and personal computers intertwined. So as soon as it was finished, I went back to work on this tale of digital age innovators.
On who turned him on to Ada Lovelace:
My daughter.
On the connection of the arts and the sciences, of the humanities to technology:
The most creative innovations of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. “I always thought of myself as a humanities person as a kid, but I liked electronics,” Steve Jobs told me when I embarked on his biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.
Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar, and his drawing of the Vitruvian Man became the symbol, of the creativity that flourishes when the humanities and sciences interact. When Einstein got stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.
On crowdsourcing parts of this book:
I tried something different for this book: crowdsourcing suggestions and corrections on many of the chapters. This isn’t a new thing. Sending around papers for comments is one reason why the Royal Society was created in London in 1660 and why Benjamin Franklin founded the American Philosophical Society. At Time magazine, we had a practice of sending story drafts to all bureaus for their “Comments and Corrections,” which was very useful. In the past, I’ve sent parts of my drafts to dozens of people I knew. By using the Internet, I could solicit comments and corrections from thousands of people I didn’t know.
This seemed fitting, because facilitating the collaborative process was one reason the Internet was created. One night when I was writing about that, I realized that I should try using the Internet for this original purpose. It would, I hoped, both improve my drafts and allow me to understand better how today's Internet-based tools (compared to Usenet and the old bulletin board systems) facilitate collaboration.
I experimented on many sites. The best, it turned out, was Medium, which was created by Ev Williams, a character in this book. One excerpt was read by 18,200 people in its first week online. That’s approximately 18,170 more draft readers than I’ve ever had in the past. Scores of readers posted comments, and hundreds sent me e-mails. This led to many changes and additions as well as an entirely new section on Dan Bricklin and VisiCalc.
On artificial intelligence:
I was struck that the quest for artificial intelligence – machines that think on their own – has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines.
Computers can do some of the toughest tasks in the world (assessing billions of possible chess positions, finding correlations in hundreds of Wikipedia-sized information repositories), but they cannot perform some of the tasks that seem most simple to us mere humans. Ask Google a hard question like, “What is the depth of the Red Sea?” and it will instantly respond 7,254 feet, something even your smartest friends don’t know. Ask it an easy one like, “Can a crocodile play basketball?” and it will have no clue, even though a toddler could tell you, after a bit of giggling.
“The Analytical Engine has no pretensions whatever to originate anything,” Ada Lovelace declared. “It can do whatever we know how to order it to perform.” In her mind, machines would not replace humans but instead become their partners. What humans would bring to this relationship, she said, was originality and creativity. This was the idea behind an alternative to the quest for pure artificial intelligence: pursuing instead the augmented intelligence that occurs when machines become partners with people. The strategy of combining computer and human capabilities, of creating a human-computer symbiosis, turned out to be more fruitful than the pursuit of machines that could think on their own. Artificial intelligence need not be the holy grail of computing. The goal instead could be to find ways to optimize the collaboration between human and machine capabilities – to let the machines do what they do best and have them let us do what we do best.
On collaborative spaces:
The most productive teams were those that brought together, usually in close physical proximity, people with a wide array of specialties. Bell Labs was a classic example. In its long corridors in suburban New Jersey, there were theoretical physicists, experimentalists, materials scientists, engineers, a few businessmen, and even some pole-climbers with grease under their fingernails. Walter Brattain, an experimentalist, and John Bardeen, a theorist, shared a workspace, like a librettist and composer sharing a piano bench, so they could perform a call-and-response all day about how to manipulate germanium to make what became the first transistor.
Even though the Internet provided a tool for virtual and distant collaborations, a lesson in this book is that, now as in the past, physical proximity is still often desirable. The founders of Intel created a sprawling, team-oriented open workspace where employees from Noyce on down rubbed shoulders with one another. It was a model that became common in Silicon Valley. Predictions that digital tools would allow workers to telecommute were never fully realized. When Steve Jobs designed a new headquarters for Pixar, he obsessed over ways to structure the atrium, and even where to locate the bathrooms, so that serendipitous personal encounters would occur. Among his last creations was the plan for Apple’s new signature headquarters, a circle with rings of open workspaces surrounding a central courtyard.
On forming good teams:
Throughout history the best leadership has come from teams that combined people who had complementary talents. That was the case with the founding of the United States. The leaders included an icon of rectitude, George Washington; brilliant thinkers such as Jefferson and Madison; men of vision and passion, including Samuel and John Adams; and a sage conciliator, Benjamin Franklin. Likewise, the founders of the ARPANET included visionaries such as Licklider, crisp decision-making engineers such as Larry Roberts, politically adroit people handlers such as Bob Taylor, and collaborative oarsmen such as Steve Crocker and Vint Cerf.
Another key to fielding a great team was pairing visionaries, who can generate ideas, with operating managers, who can execute them. Visions without execution are hallucinations. Robert Noyce and Gordon Moore were both visionaries, which is why it was important that their first hire at Intel was Andy Grove, who knew how to impose crisp management procedures, force people to focus, and get things done.
Visionaries who lack such teams around them often go down in history as merely footnotes. There is a lingering historical debate over who most deserves to be dubbed the inventor of the electronic digital computer: John Atanasoff, a professor who worked almost alone at Iowa State, or the team led by John Mauchly and Presper Eckert at the University of Pennsylvania. In this book I give more credit to members of the latter group, partly because they were able to get their machine – ENIAC – up and running and solving problems. They did so with the help of dozens of engineers and mechanics plus a cadre of women who handled programming duties. Atanasoff’s machine, by contrast, ended up not fully working, partly because there was no one to help him figure out how to make his punch card burner operate. It ended up being consigned to a basement, then discarded when no one could remember exactly what it was.
On the important role of women as the first programmers:
Starting with Charles Babbage, the men who invented computers focused primarily on the hardware. But the women who became involved during World War II saw early on the importance of programming, just as Ada Lovelace had. They developed ways to code the instructions that told the hardware what sequence of operations to perform. In this software lay the magic formulas that could transform the machines in wondrous ways. The most colorful programming pioneer was a spirited and feisty, yet also charming and collegial, naval officer named Grace Hopper.
The engineers who built ENIAC’s hardware were all men. But less heralded by history was a group of women, six in particular, who turned out to be almost as important in the development of modern computing. As ENIAC was being constructed at Penn in 1945, it was thought that it would perform a specific set of calculations over and over – such as determining a missile’s trajectory using different variables. But the end of the war meant that the machine was needed for many other types of calculations – sonic waves, weather patterns, and the explosive power of new types of atom bombs – that would require it to be reprogrammed often.
This entailed rewiring ENIAC’s rat’s nest of cables by hand and resetting its switches. At first, the programming seemed to be a routine, perhaps even menial, task – which may have been why it was relegated to women, who back then were not thought of as engineers. But what the women of ENIAC soon showed, and the men later came to understand, was that the programming of a computer could be just as significant as the design of its hardware. The tale of Jean Jennings is illustrative of the early women computer programmers.
On the creation of the Internet and its predecessor:
Like the computer, the ARPANET and Internet were designed by collaborative teams. Decisions were made through a process, coordinated by a deferential graduate student, of sending around proposals as “requests for comments.” That led to a web-like packet-switched network, with no central authority or hubs, in which power was fully distributed to every one of the nodes, each having the ability to create and share content and route around attempts to impose controls. A collaborative process thus produced a system designed to facilitate collaboration. The Internet was imprinted with the DNA of its creation.
There was another contributor to the Internet’s DNA: it was funded by people in the Pentagon and Congress who wanted a communications system that could survive a nuclear attack. The ARPA researchers never shared or even knew about that goal; many were avoiding the draft at the time. This led to a sweet irony: a system funded partly to facilitate command and control ended up undermining central authority. The street finds its own uses for things.
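That distributed principle is concrete enough to sketch in code. Here is a minimal illustration (a toy breadth-first search in Python, not the actual ARPANET routing algorithm) of a mesh with no central hub, its four nodes named for the first ARPANET sites: knock out any node, and traffic simply flows around it.

    # Toy model of a distributed mesh: no central hub, every node equal.
    # An illustrative sketch only, not the real ARPANET protocol.
    from collections import deque

    def find_route(links, source, dest, down=frozenset()):
        """Breadth-first search for a path from source to dest,
        skipping any nodes in `down` (failed or blocked)."""
        queue = deque([[source]])
        seen = {source}
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == dest:
                return path
            for neighbor in links.get(node, ()):
                if neighbor not in seen and neighbor not in down:
                    seen.add(neighbor)
                    queue.append(path + [neighbor])
        return None  # no surviving route

    # Four nodes named for the first ARPANET sites; the wiring here is illustrative.
    links = {
        "UCLA": ["SRI", "UCSB"],
        "SRI": ["UCLA", "UCSB", "Utah"],
        "UCSB": ["UCLA", "SRI", "Utah"],
        "Utah": ["SRI", "UCSB"],
    }

    print(find_route(links, "UCLA", "Utah"))                # ['UCLA', 'SRI', 'Utah']
    print(find_route(links, "UCLA", "Utah", down={"SRI"}))  # ['UCLA', 'UCSB', 'Utah']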
On collective collaboration:
The Internet facilitated not only collaboration within teams but also among crowds of people who didn’t know each other. This is the advance that is closest to being revolutionary. Networks for collaboration have existed ever since the Persians and Assyrians invented postal systems. But never before has it been easy to solicit and collate contributions from thousands or millions of unknown collaborators. This led to innovative systems – Google page ranks, Wikipedia entries, the Firefox browser, the GNU/Linux software – based on the collective wisdom of crowds.
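One of those systems makes a tidy example of how crowd wisdom becomes an algorithm. Below is a minimal sketch of the idea behind Google’s page ranks, greatly simplified and run on a hypothetical three-page web: a page matters if pages that matter link to it, so the ranking emerges from the crowd’s collective linking behavior rather than from any central editor. The production algorithm also handles pages with no outgoing links, enormous scale, and spam, none of which is modeled here.

    # Minimal sketch of the PageRank idea: rank flows along links, so a
    # page's score reflects the collective linking behavior of the crowd.
    # Simplified: assumes every page links to at least one other page.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / n for p in pages}
            for page, outgoing in links.items():
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
            rank = new_rank
        return rank

    # A hypothetical three-page web: A links to B and C, B to C, C back to A.
    web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
    for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))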
On leadership:
The most successful endeavors in the digital age were those run by leaders who fostered collaboration while also providing a clear vision. Too often these are seen as conflicting traits: a leader is either very inclusive or a passionate visionary. But the best leaders could be both. Robert Noyce was a good example. He and Gordon Moore drove Intel forward based on a sharp vision of where semiconductor technology was heading, and they both were collegial and non-authoritarian to a fault. Even Steve Jobs and Bill Gates, with all of their prickly intensity, knew how to build strong teams around them and inspire loyalty.
Brilliant individuals who could not collaborate tended to fail. William Shockley’s transistor company disintegrated. Similarly, collaborative groups that lacked passionate and willful visionaries also failed. Bell Labs, after inventing the transistor, went adrift. So did Apple after Steve Jobs was ousted in 1985.
Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off,” Steve Jobs said. “It happened at Apple when Sculley came in, which was my fault, and it happened when Ballmer took over at Microsoft.” Larry Page felt the same. “The best leaders are those with the deepest understanding of the engineering and product design,” he said.
On growing up in the days of ham radios:
Many of these leaders had an advantage, when they were young, that few kids have today. They got to tinker with ham radios and circuit boards, learn how to solder and sort tubes and transistors. Back then, electronic devices could be opened up, unlike today’s laptops and iPads. Nowadays many kids grow up learning how to code, and for them Ruby and Python and JavaScript are as familiar as Latin and French were to previous generations. That’s good. Few of them, however, grow up understanding schematics of circuits or feeling empowered to jack into their electronic equipment – or even change the battery. It might be useful, in the age of sealed gizmos and tiny microprocessors, to find a way to restore the hands-on thrill that produced a generation of hardware hackers and hobbyists.
On social use of technology:
One lesson of the digital age is as old as Aristotle: man is a social animal. Almost every digital tool, whether designed for it or not, was commandeered by humans for a social purpose – to create communities, facilitate communication, collaborate on projects, and enable social networking. Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare.
On the need for both the humanities and for the sciences:
We humans can remain relevant in an era of cognitive computing because we are able to think different, something that an algorithm, almost by definition, can’t master. We possess an imagination that, as Ada Lovelace said, “brings together things, facts, ideas, conceptions in new, original, endless, ever-varying combinations.” We discern patterns and appreciate their beauty. We weave information into narratives. We are storytelling animals as well as social ones.
Human creativity involves values, intentions, aesthetic judgments, social emotions, and personal consciousness. These are what the arts and humanities teach us – and why those realms are as valuable a part of education as science, technology, engineering, and math. If we humans are to uphold our end of the man-machine symbiosis, if we are to retain a role as partners with our machines, we must continue to nurture the wellsprings of our creativity. That is what we bring to the party.
The converse, however, is also true. People who love the arts and humanities should endeavor to appreciate the beauties of math and physics, just as Ada did. Otherwise, they will be left as bystanders at the intersection of arts and science, where most digital-age creativity will occur. They will surrender control of that territory to the engineers.
Many people who extol the arts and the humanities, who applaud vigorously the paeans to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don’t understand math or physics. They would consider people who don’t know Hamlet from Macbeth to be philistines, yet they might merrily admit that they don’t know the difference between a gene and a chromosome, or a transistor and a diode, or an integral and a differential equation. These things may seem hard. Yes, but so, too, is Hamlet. And like Hamlet, each of these concepts is beautiful. Like an elegant mathematical equation, they are expressions of the glories of the universe.
The next phase of the digital revolution will bring a true fusion of technology with the creative industries, such as media, fashion, music, entertainment, education, literature, and the arts. Until now, much of the innovation has involved pouring old wine – books, newspapers, opinion pieces, journals, songs, television shows, movies – into new digital bottles. But the interplay between technology and the creative arts will eventually result in completely new forms of expression and media.
This innovation will come from people who are able to link beauty to engineering, humanity to technology, and poetry to processors. In other words, it will come from the spiritual heirs of Ada Lovelace, creators who can flourish where the arts intersect with the sciences, and who have a rebellious sense of wonder that opens them to the beauty of both.