The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution

Walter Isaacson

Sound recording - 2014

"Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson's revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens. What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail? In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron's daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan T...uring, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page. This is the story of how their minds worked and what made them so inventive. It's also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative. For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen"--

Location: 2nd Floor
Call Number: COMPACT DISC/004.0922/Isaacson
Status: Checked In (1 / 1 copies available)
Published
[New York, NY] : Simon & Schuster Audio p2014.
Language
English
Main Author
Walter Isaacson
Other Authors
Dennis Boutsikaris (narrator)
Edition
Unabridged
Physical Description
15 audio discs (17 hr., 30 min.) : digital ; 4 3/4 in
ISBN
9781442376229
Contents
  • Ada, Countess of Lovelace
  • The Computer
  • Programming
  • The Transistor
  • The Microchip
  • Video Games
  • The Internet
  • The Personal Computer
  • Software
  • Online
  • The Web
  • Ada Forever
Review by Choice Review

Isaacson (CEO, Aspen Institute) follows his Jobs biography, Steve Jobs (CH, Apr'12, 49-4500), with an exceptional history of the innovations that drove the digital revolution. Besides revealing the technologies involved, he integrates succinct profiles of important individuals and corporations, emphasizing the management styles deployed that either encouraged innovation or foiled success. The collaboration between Ada Lovelace and Charles Babbage in the 1840s launched the digital revolution. Babbage's Analytical Engine and Lovelace's accompanying commentary and algorithms were inspirational for later generations. The author discusses the transformation of the 19th-century world of human calculators into today's digital world of the web, and explains that ubiquitous computers, smart appliances, and virtual social spaces required many significant innovations. Switching circuits, transistors, microchips, microprocessors, the mouse, and memory storage were prerequisite; the conceptual shift away from single-use computers, e.g., the ENIAC for hydrogen bomb calculations, to multipurpose programmable computers was critical. The journey of innovation continued with the birth of time-sharing and ARPANET, which evolved into the Internet; the successful launch of personal computers by Gates and Jobs; e-mail, Usenet groups, and bulletin boards creating community; and operating systems like Linux becoming open and free. Isaacson concludes his engaging history with recent innovations that are building the web. Summing Up: Highly recommended. All readership levels. --Mark Mounts, Dartmouth College

Copyright American Library Association, used with permission.
Review by New York Times Review

During the H-bomb testing frenzy of the 1950s, a RAND Corporation researcher named Paul Baran became concerned about the fragility of America's communications networks. The era's telephone systems required users to connect to a handful of major hubs, which the Soviets would doubtless target in the early hours of World War III. So Baran dreamed up a less vulnerable alternative: a decentralized network that resembled a vast fishnet, with an array of small nodes that were each linked to a few others. These nodes could not only receive signals but also route them along to their neighbors, thereby creating countless possible paths for data to keep flowing should part of the network be destroyed. This data would travel across the structure in tiny chunks, called "packets," that would self-assemble into coherent wholes upon reaching their destinations. When Baran pitched his concept to AT&T, he was confident the company would grasp the wisdom of building a network that could withstand a nuclear attack. But as Walter Isaacson recounts in "The Innovators," his sweeping and surprisingly tenderhearted history of the digital age, AT&T's executives reacted as if Baran had asked them to get into the unicorn-breeding business. They explained at length why his "packet-switching" network was a physical impossibility, at one point calling in 94 separate technicians to lecture Baran on the limits of the company's hardware. "When it was over, the AT&T executives asked Baran, 'Now do you see why packet switching wouldn't work?'" Isaacson writes. "To their great disappointment, Baran simply replied, 'No.'"

AT&T thus blew its chance to loom large in technological lore, for packet switching went on to become a keystone of the Internet. But the company can take solace in the fact that it was hardly alone in letting knee-jerk negativity blind it to a tremendous digital opportunity: Time and again in "The Innovators," powerful entities shrug their shoulders when presented with zillion-dollar ideas. Fortunately for those of us who now feel adrift when our iPads and 4G phones are beyond arm's reach, the Paul Barans of the world are not easily discouraged.

Stubbornness is just one of the personality traits ubiquitous among the brilliant subjects of "The Innovators." Isaacson identifies several other virtues that were essential to his geeky heroes' success, none of which will surprise those familiar with Silicon Valley's canon of self-help literature: The digital pioneers all loathed authority, embraced collaboration and prized art as much as science. Though its lessons may be prosaic, the book is still absorbing and valuable, and Isaacson's outsize narrative talents are on full display. Few authors are more adept at translating technical jargon into graceful prose, or at illustrating how hubris and greed can cause geniuses to lose their way.

Having chosen such an ambitious project to follow his 2011 biography of the Apple co-founder Steve Jobs, Isaacson is wise to employ a linear structure that gives "The Innovators" a natural sense of momentum. The book begins in the 1830s with the prescient Ada Lovelace, Lord Byron's mathematically gifted daughter, who envisioned a machine that could perform varied tasks in response to different algorithmic instructions. (Isaacson takes pains throughout to salute the unheralded contributions of female programmers.) The story then skips ahead to the eve of World War II, when engineers scrambled to build machines capable of calculating the trajectories of missiles and shells.
One of these inventors was John Mauchly, a driven young professor at Ursinus College. In June 1941, he paid a visit to Ames, Iowa, where an electrical engineer named John Atanasoff had cobbled together an electronic calculator "that could process and store data at a cost of only $2 per digit" - a seemingly magical feat. Against the advice of his wife, who suspected that Mauchly was a snake, Atanasoff proudly showed off his ragtag creation. Soon thereafter, Mauchly incorporated some of Atanasoff's ideas into Eniac, the 27-ton machine widely hailed as the world's first true computer. The bitter patent fight that ensued would last until 1973, with Atanasoff emerging victorious. Mauchly is often demonized for stealing from that most romantic of tech archetypes, the "lone tinkerer in a basement" who sketched out brainstorms on cocktail napkins. But Isaacson contends that men like Atanasoff receive too much adulation, for an ingenious idea is worthless unless it can be executed on a massive scale. If Mauchly hadn't come to Iowa to "borrow" his work, Atanasoff would have been "a forgotten historical footnote" rather than a venerated father of modern computing.

Isaacson is not nearly as sympathetic in discussing the sins of William Shockley, who shared a 1956 Nobel Prize in Physics for co-inventing the transistor. Shockley is the book's arch-villain, a glory hog whose paranoid tendencies destroyed the company that bore his name. (He once forced all his employees to take lie-detector tests to determine if someone had sabotaged the office.) His eight best researchers quit and went on to found Fairchild Semiconductor, arguably the most seminal company in digital history; Shockley, meanwhile, devolved into a raving proponent of odious theories about race and intelligence.

The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities "shared a resistance to power elites and a desire to control their own access to information." Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he's accustomed to writing biographies of men who operated inside the corridors of power - Benjamin Franklin, Henry Kissinger, Jobs - Isaacson seems a bit baffled by committed outsiders like Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of '60s counterculture.

Yet this minor shortcoming is quickly forgiven when "The Innovators" segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk - a spoiled brat and compulsive gambler who "was rebellious just for the hell of it." Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm - in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine's MS-DOS operating system at will.
Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian.

"The Innovators" cannot really be faulted for the hastiness of its final pages, in which Isaacson provides brief and largely unilluminating glimpses at Twitter, Wikipedia and Google. There is no organic terminus for the book's narrative, since digital technology did not cease to evolve the moment Isaacson handed in his manuscript. As a result, any ending was doomed to feel dated. (There is, for example, but a single passing mention of the digital currency Bitcoin.) But even at its most rushed, the book evinces a genuine affection for its subjects that makes it tough to resist. Isaacson confesses early on that he was once "an electronics geek who loved Heathkits and ham radios," and that background seems to have given him keen insight into how youthful passion transforms into professional obsession. His book is thus most memorable not for its intricate accounts of astounding breakthroughs and the business dramas that followed, but rather for the quieter moments in which we realize that the most primal drive for innovators is a need to feel childlike joy.

BRENDAN I. KOERNER is a contributing editor at Wired and the author, most recently, of "The Skies Belong to Us: Love and Terror in the Golden Age of Hijacking."

Copyright (c) The New York Times Company [September 28, 2014]
Review by Booklist Review

*Starred Review* In 1843, Ada Lovelace, the daughter of Lord Byron, wrote in a letter to Charles Babbage that mathematical calculating machines would one day become general-purpose devices that link the operations of matter and the abstract mental processes, correctly predicting the rise of modern computers. Thus begins a remarkable overview of the history of computers from the man who brought us biographies of Steve Jobs, Benjamin Franklin, Albert Einstein, and Henry Kissinger. The story is above all one of collaboration and incremental progress, which lies in contrast to our fascination with the lone inventor. Here we find that in a world dominated by men with their propensity for hardware, the first contributions to software were made by women. While we have those storied partnerships of the digital age (Noyce and Moore, Hewlett and Packard, Allen and Gates, and Jobs and Wozniak), all of their contributions were built upon the advances of lesser-known pioneers, who are heralded in these pages. Although full biographies of the individuals profiled here have been written in spades, Isaacson manages to bring together the entire universe of computing, from the first digitized loom to the web, presented in a very accessible manner that often reads like a thriller.--Siegfried, David. Copyright 2014 Booklist

From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Publisher's Weekly Review

Starred Review. The history of the computer as told through this fascinating book is not the story of great leaps forward but rather one of halting progress. Journalist and Aspen Institute CEO Isaacson (Steve Jobs) presents an episodic survey of advances in computing and the people who made them, from 19th-century digital prophet Ada Lovelace to Google founders Larry Page and Sergey Brin. His entertaining biographical sketches cover headline personalities (such as a manic Bill Gates in his salad days) and unsung toilers, like WWII's pioneering female programmers, and outright failures whose breakthroughs fizzled unnoticed, such as John Atanasoff, who was close to completing a full-scale model computer in 1942 when he was drafted into the Navy. Isaacson examines these figures in lucid, detailed narratives, recreating marathon sessions of lab research, garage tinkering, and all-night coding in which they struggled to translate concepts into working machinery. His account is an antidote to his 2011 Great Man hagiography of Steve Jobs; for every visionary--or three (vicious fights over who invented what are ubiquitous)--there is a dogged engineer; a meticulous project manager; an indulgent funder; an institutional hothouse like ARPA, Stanford, and Bell Labs; and hordes of technical experts. Isaacson's absorbing study shows that technological progress is a team sport, and that there's no I in computer. Photos. Agent: Amanda Urban, ICM. (Oct.) (c) Copyright PWxyz, LLC. All rights reserved.

Review by Library Journal Review

Starred Review. Taking a chronological, people-oriented approach, rather than a scientific or technical one, to the history of computers, the Internet, and digital technology, Isaacson (Steve Jobs) illuminates the ways teamwork, collaboration, and creativity have led to the current tech-driven world. As much biography as computer history, the work discusses such people, companies, and developments as 1840s computer programming pioneer Ada Lovelace, Vannevar Bush, Alan Turing, Doug Engelbart, Bill Gates, Steve Wozniak, Steve Jobs, IBM, ENIAC, Microsoft, and Apple. Dennis Boutsikaris's moderately paced, low-key reading makes the book an easy, thoroughly engrossing listen. This fascinating and unique work would be a nice companion to Erik Brynjolfsson and Andrew McAfee's The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. VERDICT This program will appeal to general listeners interested in the history of the computer age and entrepreneurs looking for nontraditional new industries, business models, and marketing concepts. ["Anyone who uses a computer in any of its contemporary shapes or who has an interest in modern history will enjoy this book," read the review of the S. & S. hc, LJ 9/15/14.]-Laurie Selwyn, formerly with Grayson Cty. Law Lib., Sherman, TX (c) Copyright 2015. Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.

Review by Kirkus Book Review

A panoramic history of technological revolution. "Innovation occurs when ripe seeds fall on fertile ground," Aspen Institute CEO Isaacson (Steve Jobs, 2011, etc.) writes in this sweeping, thrilling tale of three radical innovations that gave rise to the digital age. First was the evolution of the computer, which Isaacson traces from its 19th-century beginnings in Ada Lovelace's "poetical" mathematics and Charles Babbage's dream of an "Analytical Engine" to the creation of silicon chips with circuits printed on them. The second was "the invention of a corporate culture and management style that was the antithesis of the hierarchical organization of East Coast companies." In the rarefied neighborhood dubbed Silicon Valley, new businesses aimed for a cooperative, nonauthoritarian model that nurtured cross-fertilization of ideas. The third innovation was the creation of demand for personal devices: the pocket radio; the calculator, marketing brainchild of Texas Instruments; video games; and finally, the holy grail of inventions: the personal computer. Throughout his action-packed story, Isaacson reiterates one theme: Innovation results from both "creative inventors" and "an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together." Who invented the microchip? Or the Internet? Mostly, Isaacson writes, these emerged from "a loosely knit cohort of academics and hackers who worked as peers and freely shared their creative ideas.... Innovation is not a loner's endeavor." Isaacson offers vivid portraits, many based on firsthand interviews, of mathematicians, scientists, technicians and hackers (a term that used to mean anyone who fooled around with computers), including the elegant, "intellectually intimidating," Hungarian-born John von Neumann; impatient, egotistical William Shockley; Grace Hopper, who joined the Navy to pursue a career in mathematics; "laconic yet oddly charming" J.C.R. Licklider, one father of the Internet; Bill Gates, Steve Jobs, and scores of others. Isaacson weaves prodigious research and deftly crafted anecdotes into a vigorous, gripping narrative about the visionaries whose imaginations and zeal continue to transform our lives. Copyright Kirkus Reviews, used with permission.


INTRODUCTION
HOW THIS BOOK CAME TO BE

The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up in a garret or garage by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs--who they were, how their minds worked, and what made them so creative. It's also a narrative of how they collaborated and why their ability to work as teams made them even more creative.

The tale of their teamwork is important because we don't often focus on how central that skill is to innovation. There are thousands of books celebrating people we biographers portray, or mythologize, as lone inventors. I've produced a few myself. Search the phrase "the man who invented" on Amazon and you get 1,860 book results. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today's technology revolution was fashioned. It can also be more interesting.

We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focus on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills proved most useful? How did they lead and collaborate? Why did some succeed and others fail?

I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.

Histories can be written with a different emphasis on any of these factors. An example is the invention of the Harvard/IBM Mark I, the first big electromechanical computer. One of its programmers, Grace Hopper, wrote a history that focused on its primary creator, Howard Aiken. IBM countered with a history that featured its teams of faceless engineers who contributed the incremental innovations, from counters to card feeders, that went into the machine. Likewise, what emphasis should be put on great individuals versus on cultural currents has long been a matter of dispute; in the mid-nineteenth century, Thomas Carlyle declared that "the history of the world is but the biography of great men," and Herbert Spencer responded with a theory that emphasized the role of societal forces. Academics and participants often view this balance differently. "As a professor, I tended to think of history as run by impersonal forces," Henry Kissinger told reporters during one of his Middle East shuttle missions in the 1970s. "But when you see it in practice, you see the difference personalities make."
When it comes to digital-age innovation, as with Middle East peacemaking, a variety of personal and cultural forces all come into play, and in this book I sought to weave them together.

The Internet was originally built to facilitate collaboration. By contrast, personal computers, especially those meant to be used at home, were devised as tools for individual creativity. For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.

Historians of science are sometimes wary about calling periods of great change revolutions, because they prefer to view progress as evolutionary. "There was no such thing as the Scientific Revolution, and this is a book about it," is the wry opening sentence of the Harvard professor Steven Shapin's book on that period. One method that Shapin used to escape his half-joking contradiction is to note how the key players of the period "vigorously expressed the view" that they were part of a revolution. "Our sense of radical change afoot comes substantially from them."

Likewise, most of us today share a sense that the digital advances of the past half century are transforming, perhaps even revolutionizing the way we live. I can recall the excitement that each new breakthrough engendered. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services. As Wordsworth said of the enthusiasts who were present at the beginning of the French Revolution, "Bliss was it in that dawn to be alive."

I began work on this book more than a decade ago. It grew out of my fascination with the digital-age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information networker and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had coauthored with a colleague about the creative teamwork of six friends who shaped America's cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs.
But his story reinforced my interest in how the development of the Internet and computers intertwined, so as soon as I finished that book, I went back to work on this tale of digital-age innovators.

The protocols of the Internet were devised by peer collaboration, and the resulting system seemed to have embedded in its genetic code a propensity to facilitate such collaboration. The power to create and transmit information was fully distributed to each of the nodes, and any attempt to impose controls or a hierarchy could be routed around. Without falling into the teleological fallacy of ascribing intentions or a personality to technology, it's fair to say that a system of open networks connected to individually controlled computers tended, as the printing press did, to wrest control over the distribution of information from gatekeepers, central authorities, and institutions that employed scriveners and scribes. It became easier for ordinary folks to create and share content.

The collaboration that created the digital age was not just among peers but also between generations. Ideas were handed off from one cohort of innovators to the next. Another theme that emerged from my research was that users repeatedly commandeered digital innovations to create communications and social networking tools. I also became interested in how the quest for artificial intelligence--machines that think on their own--has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines. In other words, the collaborative creativity that marked the digital age included collaboration between humans and machines.

Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. "I always thought of myself as a humanities person as a kid, but I liked electronics," Jobs told me when I embarked on his biography. "Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that's what I wanted to do." The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.

Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.

When it comes to computers, there is one other historical figure, not as well known, who embodied the combination of the arts and sciences. Like her famous father, she understood the romance of poetry. Unlike him, she also saw the romance of math and machinery. And that is where our story begins.

Excerpted from The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution by Walter Isaacson. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.