Give bestselling biographer Walter Isaacson credit. His most popular titles have focused on idiosyncratic, imaginative geniuses: Benjamin Franklin, Albert Einstein and Steve Jobs. But heʼs clearly not content to stick with the easy-to-regurgitate mythology that often accompanies the popular perception of these singular figures.
From Isaacsonʼs perspective, visionaries like Franklin, Einstein and Jobs succeeded, to varying degrees, not only because they could envision Big Ideas but also because they could execute them effectively. And execution requires the ability to collaborate.
His latest popular history, The Innovators, which was longlisted for the National Book Award, repeatedly returns to this theme: the role that teamwork and collaboration have played in the dizzying chain of technological advances that defines the Digital Age.
Isaacson started on this project in the early aughts but set it aside in 2009 to finish the Jobs biography, which was published weeks after the Apple visionary died of a rare form of pancreatic cancer in October 2011. Isaacsonʼs original concept was narrower in focus: the teams that invented the modern-day Internet. But Microsoft founder Bill Gates persuaded him that the more compelling tale would illustrate the simultaneous but separate emergence of the personal computer and the networks that became the Internet. The world truly changed when those two story lines collided, crossed and merged.
Isaacson succeeds in telling an accessible tale tailored to a general-interest audience. He avoids the overhyped quicksand that swallows many technology writers as they miscast tiny incremental advances as “revolutionary.”
Instead, Isaacson focuses on the evolutionary nature of progress. The Innovators succeeds in large part because Isaacson repeatedly shows how these visionaries, through design or dumb luck, were able to build and improve on the accomplishments of previous generations.
Many of the featured math geeks, engineers and inventors should be recognizable to anyone with a passing knowledge of a smartphone and the ability to operate a search engine: Alan Turing; John von Neumann; Robert Noyce; Tim Berners-Lee; Gates and Paul Allen; Jobs and Steve Wozniak; Jimmy Wales; Larry Page and Sergey Brin; and Evan Williams.
Isaacson elevates the tale — and reinforces the running theme about collaborative creativity — when he profiles some of the lesser-known figures toiling in academic, government and private-sector laboratories and businesses such as Bell Labs, IBM and Intel. Their work led to the development of electronic circuitry, transistors, the first mainframes, programming languages, cryptology, microchips, the personal computer, the precursor networks to the World Wide Web and the Internet, search engines and todayʼs social networks.
While this is a story dominated by men, Isaacson takes great care to highlight the underappreciated contributions of a handful of women, including Grace Hopper, who helped program the Harvard Mark I computer in the early 1940s, and the six women who worked largely under the cloak of wartime secrecy on the original ENIAC computer at the University of Pennsylvania. Underwritten by the U.S. Army, the ENIAC was initially designed to help calculate artillery shell trajectories, but as the inventors soon realized, it could be reprogrammed as one of the earliest “general purpose” mainframe machines.
The spiritual mother of The Innovators is Ada Lovelace, the estranged — and strange — daughter of the Romantic poet Lord Byron. Lovelace became a patron and muse to British mathematician Charles Babbage, who started building mammoth calculating machines in the 1830s and improved on them after studying mechanized looms. An essay Lovelace published in 1843 about Babbageʼs Analytical Engine was largely dismissed at the time but is now considered a groundbreaking text by some computer historians. In addition to devising some elaborate codes for Babbageʼs machine, she envisioned a future of “poetical science,” in which machines could become creative partners in the human imagination.
“This insight,” Isaacson writes, “would become the core concept of the digital age: any piece of content, data or information — music, text, pictures, numbers, symbols, sounds, video — could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to the ones that we now call computers.”
While recognizing that some computer scientists and historians have questioned Lovelaceʼs standing as a feminist icon and computer pioneer, Isaacson considers her contributions “profound and inspirational.”
Larry Lebowitz is a Miami writer.