Crash Course: The History of Computers, Microchips, and the Internet

This is the story of the unsung innovators. The nameless Alexander Graham Bells. The forgotten Steve Jobs. The Thomas Edisons known only to those who truly care. Who invented the transistor? The microchip? The internet? A round of applause if you can answer any of those questions. I could not before reading The Innovators.

Many of these innovations cannot actually be attributed to a single person; they are an amalgamation of the contributions of many people. As you read The Innovators, a couple of trends become clear. Creativity is a collaborative process: “Innovation comes from teams more often than from the lightbulb moments of lone geniuses” (480). The greatest successes came when a visionary and a doer teamed up, the visionary leading the way with bright ideas and motivation, and the engineers and decision makers making it happen. Many of the people behind these teams grew up in the convivial Midwest, and most spent their childhoods tinkering with ham radios and DIY kits. It goes to show that early exposure to technology can inspire a lifelong career.

“Innovation requires having at least three things: a great idea, the engineering talent to execute it, and the business savvy (plus the deal-making moxie) to turn it into a successful product” (215).

As I began to summarize The Innovators, I realized my task would be quite difficult. It feels sacrilegious to brush over certain inventions and teams since ideas build on each other, so I decided to properly summarize each part rather than skimming most and focusing on just one. “This is the story of these pioneers, hackers, innovators, and entrepreneurs—who they were, how their minds worked, and what made them so creative” (1). This is the crash course to the history of innovation.

1840s, Ada, Countess of Lovelace

Our story begins well before you might expect: in the 1840s. As the daughter of the Romantic poet Lord Byron and a mathematically trained mother, Lady Lovelace developed a passion for what she called “poetical science,” the art and beauty of mathematics. She worked with Charles Babbage, an inventor whose Difference Engine computed polynomial equations with a mechanical step-by-step method and whose proposed Analytical Engine was designed as a general-purpose machine. Studying the Analytical Engine, Ada saw the future of computing. Her notes describing that future, including a step-by-step algorithm the machine could run (often called the first computer program), became the inspiration for the computers that followed. Who invented the computer? Ada Lovelace certainly thought it up first.

1940s, Computers

The ENIAC

The question of who built the first computer is more up in the air. Ideas built off one another, and some concepts were developed by multiple people at the same time. For all intents and purposes, a computer must be:

  • Digital: discrete integers, often binary, rather than analog (see the quick sketch after this list)
  • Electronic: vacuum tubes, transistors, or microchips rather than mechanical switches
  • Programmable: able to be programmed for general purpose use, rather than specialized use
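To make “digital, often binary” concrete, here is a tiny illustrative sketch in Python (my own example, not the book’s) showing that an ordinary number is just a pattern of discrete on/off bits:

```python
# Illustration of "digital: discrete integers, often binary."
# An ordinary integer is stored as a pattern of discrete on/off bits.
n = 1843                  # a nod to the year of Ada Lovelace's Notes
bits = bin(n)[2:]         # '11100110011'
back = int(bits, 2)       # the bit pattern decodes back to the same integer

print(f"{n} in binary is {bits}")
print(f"those bits decode back to {back}")
```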

1937 was a big year for Turing, Aiken, and Atanasoff. Alan Turing laid out his idea of a universal computer in a paper called “On Computable Numbers.” The imagined machine would read instructions and then carry out tasks. At Harvard, Howard Aiken was struggling with tedious calculations when he discovered Charles Babbage’s old machine from the 1840s. Inspired, he proposed the Mark I computer; but Harvard disliked funding practical innovations, preferring theoretical academia instead. In rural Iowa, John Atanasoff began building a special-purpose, partly mechanical computer that used vacuum tubes to solve linear equations. Unfortunately, Iowa did not have the resources for him to succeed at scale.

The war spurred innovation. In Britain in 1943, the codebreaking operation at Bletchley Park, where Turing worked, built Colossus, an electronic, single-purpose computer designed to break German war codes (yes, this is the world The Imitation Game depicts!). The war also spurred the Navy to fund Howard Aiken’s Mark I at Harvard. The Mark I would be digital and programmable, but partially mechanical.

Until this point, computers struggled to be both electronic and programmable. John Mauchly, a physicist, fused ideas from other inventors by attending events and even visiting Atanasoff in Iowa to study his computer (later causing a dispute over intellectual property rights). He then teamed up with J. Presper Eckert, a quirky, perfectionist engineer who complemented Mauchly, the visionary physicist. The military funded their venture to use high-speed vacuum tubes to calculate artillery trajectories with what they called ENIAC: the Electronic Numerical Integrator and Computer. This digital, electronic computer was programmable (though reprogramming it meant rewiring plugboards), and it became the basis of future computers.

Programming


Jean Jennings and Frances Bilas

“We do not need to have an infinity of different machines doing different jobs. A single one will suffice. The engineering problem of producing various machines for various jobs is replaced by the office work of ‘programming’ the universal machine to do these jobs” -Turing in 1948

Since many male inventors grew up tinkering with technology, they perceived the hardware to be the most important part, and thus, a “man’s job”. Programming was “relegated” to female mathematicians. Little did they know that programming would become more important than the hardware.

You may have heard of the Grace Hopper Celebration. Grace Hopper was a math PhD who valued concise explanations of complex concepts. This made her the ideal candidate to write a book on how to program Howard Aiken’s Mark I at Harvard in the 1940s. Meanwhile, in Mauchly and Eckert’s lab, mathematicians Jean Jennings, Marlyn Wescoff, Ruth Lichterman, Betty Snyder, Frances Bilas, and Kay McNulty all programmed the ENIAC. These women were the reason the ENIAC worked so well, yet they were not even invited to the public unveiling of the machine. The sexism in the industry prevailed.

The Transistor


Bardeen, Shockley, and Brattain

Transistors allowed for big processing power in small spaces. “The transistor became to the digital age what the steam engine was to the Industrial Revolution” (131). This is also the story of how the paranoid William Shockley unintentionally brought together the best engineers of his time. The brilliant yet egotistical Shockley worked with the cantankerous Walter Brattain and the quiet John Bardeen. The three bounced ideas off each other in a petri dish of creativity. One day, Brattain and Bardeen created the transistor by putting together “strips of gold foil, a chip of semiconducting material, and a bent paper clip. When wiggled just right, it could amplify an electric current and switch it on and off” (131).

Shockley, infuriated that he was not personally involved in this particular innovation, left for California to found Shockley Semiconductor. He recruited Robert Noyce, a midwestern engineer, and Gordon Moore, a chemist. Things were quite fun and innovative for a while, until Shockley’s paranoia took over. Noyce and Moore left with six other engineers to found their own company, Fairchild Semiconductor, backed by Fairchild Camera and Instrument.

The Microchip


Noyce with a microchip diagram

Separately, Noyce at Fairchild and Jack Kilby at Texas Instruments each created the microchip: an entire circuit built into a single slab of semiconductor with carefully placed impurities. Kilby’s version was a mess of tiny gold wires, whereas Noyce’s etched windows into a protective layer on the silicon and printed the metal connections on top. Gordon Moore saw the power of the microchip and predicted that the number of components that could fit on a chip would double roughly every two years. Moore’s law became a self-fulfilling prophecy.
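As a rough, back-of-the-envelope illustration of that doubling (my own sketch, not the book’s), here is what Moore’s law implies in Python, taking the roughly 2,300 transistors of Intel’s first microprocessor in 1971 as an assumed baseline:

```python
# Illustrative sketch of Moore's law: component counts doubling every two years.
# Assumed baseline: ~2,300 transistors on Intel's first microprocessor (1971).
start_year, start_count = 1971, 2_300

for year in range(start_year, 2022, 10):
    doublings = (year - start_year) / 2        # one doubling every two years
    count = start_count * 2 ** doublings
    print(f"{year}: ~{count:,.0f} transistors per chip")
```

Even this crude extrapolation reaches the billions by the 2010s, in the same ballpark as real chips.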

Noyce and Moore eventually left Fairchild to found a new company. They had no business plan, but their pure talent spoke to investment banker Arthur Rock, who said, “It was the only investment that I’ve ever made that I was 100 percent sure would succeed” (187). This was the start of venture capital. Noyce and Moore’s company? Intel.

Intel had the perfect combination of leadership. Noyce was a visionary, Moore was a brilliant scientist, and they brought in Andy Grove as the hard-charging decision maker. While we typically think of open floor plans and democratized decision making as recent innovations in workplace culture, these ideas were central to Intel’s culture in the 1970s.

One Intel engineer, Ted Hoff, “realized that it was wasteful and inelegant to design many types of microchips that each had a different function…he envisioned creating a general-purpose chip that could be instructed, or programmed, to do a variety of different applications as desired.” Thus the microprocessor was born, paving the way for smarter traffic lights, coffeemakers, elevators, personal computers, and more.

Video Games

We can’t forget the contribution of video games to computer culture. MIT hacker Steve Russell pioneered the idea of open-source content with his video game Spacewar. At Atari, Nolan Bushnell and Al Alcorn created the wildly popular Pong, which “hit upon one of the most important engineering challenges of the computer age: creating user interfaces that were radically simple and intuitive” (212).

The Internet


Larry Roberts

Your parents or grandparents might have mentioned that their early experiences with computers consisted of handing a meticulously ordered stack of punch cards to a computer operator. It was time for human-computer symbiosis.

To kick things off, MIT professor Vannevar Bush published “Science, the Endless Frontier” to remind politicians of the importance of funding scientific research. Based on this report, the National Science Foundation was established. The Defense Department later created the Advanced Research Projects Agency, or ARPA.

At ARPA, the persuasive Bob Taylor and the intense Larry Roberts decided that there should be a network to connect the ARPA computers and share resources, all managed by what they called “routers.” They based this connectivity on a concept called “packet switching,” developed by Paul Baran, Donald Davies, and Leonard Kleinrock. Messages are broken down into “packets” and sent node to node through a decentralized network to their destination. “It’s like breaking a long letter into dozens of postcards, each numbered and addressed to the same place. Each may take different routes to get to the destination, and then they’re reassembled” (238). The network Taylor and Roberts created was called ARPANET. Robert Kahn and Vint Cerf standardized the network with the Internet Protocol (IP) as a template for addressing the packets and the Transmission Control Protocol (TCP) to instruct how to put the packets back together. With TCP/IP, the internet was born.
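To make the postcard analogy concrete, here is a tiny illustrative sketch in Python (mine, not the book’s): the message is split into numbered packets, the packets arrive out of order as if they took different routes, and the sequence numbers let the receiver reassemble them.

```python
import random

# Toy packet switching: split a message into numbered packets, let them arrive
# out of order, then reassemble them by sequence number at the destination.
def to_packets(message: str, size: int = 8) -> list[tuple[int, str]]:
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    return "".join(chunk for _, chunk in sorted(packets))

message = "It's like breaking a long letter into dozens of postcards."
packets = to_packets(message)
random.shuffle(packets)                  # packets take different routes through the network
print(reassemble(packets) == message)    # True: the numbering puts them back in order
```

The real protocols do far more (addressing, retransmission, congestion control), but the numbering-and-reassembly idea at the heart of TCP is the same.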

Personal Computer

The Altair 8800

Vannevar Bush published another influential paper, “As We May Think,” to conceptualize the personal computer. He called it a memex, and imagined it would store personal files and pictures via direct entry, like a keyboard, and serve as an “enlarged intimate supplement to memory” (264).

With ARPA funding, Doug Engelbart and Bill English at the Stanford Research Institute created the mouse and an “oNLine System” (NLS) that had “graphics, multiple windows on a screen, digital publishing, document sharing, and email” (278). These innovations made a computer seem less intimidating to the average person.

At Xerox Palo Alto Research Center (PARC), Alan Kay envisioned a simple and friendly personal computer called the Dynabook. Inspired by how an Italian printer realized personal books would need to fit into saddlebags, he decided the Dynabook would have to be the size of a notebook. He eventually built an interim version called the Xerox Alto with a revolutionary bitmapped display, but the corporate bigwigs back East did not show much interest.

Meanwhile in Albuquerque, serial entrepreneur Ed Roberts founded a company called MITS and created the Altair personal computer. You could flip a few switches and the computer would display binary code answers via flashing lights. It wasn’t fancy, but this make-your-own computer made the cover of Popular Electronics. It also sparked excitement among hacker clubs such as the Homebrew Computer Club in Palo Alto.

Software


Steve Wozniak and Steve Jobs

Bill Gates never cared much for tinkering around or building ham radios. As a child, his extreme intellect helped him befriend Paul Allen, who was more socially gregarious. The two founded the Lakeside Programming Group with a few friends in their school’s computer room, where they worked on various professional projects in exchange for computer time. Gates went to Harvard and Allen went to Washington State. After seeing the Altair on the cover of Popular Electronics, they were determined to write programs for it. The two, along with math student Monte Davidoff, worked in Harvard’s computer lab, coding for up to 36 hours at a stretch before abruptly crashing. Once they had a working program, Microsoft BASIC, they licensed it to MITS for the Altair, and it soon became the standard. Even the Homebrew Computer Club got hold of the code.

Meanwhile, Steve Wozniak and Steve Jobs hacked away: Woz would code a fun program and Jobs would find a way to monetize it. “Woz was an angelic naif who looked like a panda, Jobs a demon-driven mesmerizer who looked like a whippet” (346). One day, Woz attended a Homebrew Computer Club meeting and got a glimpse of the Altair and its code. This inspired the first Apple computer. “The Apple II was the first personal computer to be simple and fully integrated, from the hardware to the software” (353). Apple was catapulted to success when Dan Bricklin created VisiCalc for the Apple II, a spreadsheet program and forerunner of Excel. People could now see that personal computers were useful, and they wanted one.

Remember the Xerox Alto with its bitmapped display? Steve Jobs saw it on a tour of Xerox PARC and decided to blatantly copy the idea by developing a graphical user interface for the Macintosh. Jobs worried Microsoft would copy the GUI as well, so he made a deal giving Apple a one-year head start. Microsoft did not yet make an operating system of its own but was developing one for IBM. Jobs had underestimated how much time Apple would need, so the head start ran out and Microsoft announced Windows just before the Macintosh shipped.

Online


The internet and personal computers advanced separately throughout the 70s and didn’t cross paths until the late 80s, when modems let people dial up from their own machines.

William von Meister was brilliant at coming up with ideas but terrible at running companies. After several failed attempts, he launched the venture that would eventually become America Online. AOL was like “going online with training wheels. It was unintimidating” and friendly (399).

While many jokes are cracked about Al Gore, he actually did play a big role by pushing through bipartisan policy that allowed for the development of the Internet. “It’s a mark of our political discourse that one of the significant nonpartisan achievements on behalf of American innovation got turned into a punchline because of something that Gore never quite said—that he invented the Internet” (402).

The World Wide Web


Tim Berners-Lee

As a child, Tim Berners-Lee loved a Victorian almanac and advice book called Enquire Within Upon Everything. As an adult, he wanted to create a collaborative playground where people could enquire within to access anything. At its core was hypertext, which links documents to other content. Working with Robert Cailliau at CERN, he named documents with Uniform Resource Locators (URLs), used the Hypertext Transfer Protocol (HTTP) to let hypertext be exchanged online, and created the Hypertext Markup Language (HTML) for writing pages, along with a rudimentary browser to access it all. Together, these pieces were the World Wide Web. Berners-Lee insisted that “Web protocols be available freely, shared openly, and put forever in the public domain” (413).
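As a small illustrative sketch (my own, not from the book), here is how those pieces fit together in Python: a URL names a page, an HTTP request fetches it, and the HTML that comes back contains hypertext links pointing to further URLs. The example address is just a placeholder.

```python
import re
from urllib.parse import urlparse
from urllib.request import urlopen

url = "https://example.com/"                   # a URL: scheme, host, and path
parts = urlparse(url)
print(parts.scheme, parts.netloc, parts.path)  # https example.com /

html = urlopen(url).read().decode("utf-8")     # an HTTP GET request returns HTML
links = re.findall(r'href="([^"]+)"', html)    # hypertext links lead to more URLs
print(links)
```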

Innovations on the World Wide Web came quickly. Marc Andreessen created Mosaic, a beautiful browser that was essentially a platform for published content. Justin Hall, a freshman at Swarthmore College, created a website called “Justin’s Links from the Underground,” a directory of interesting websites. He became the first “blogger” by publishing his own thoughts on a web log (condensed to “weblog,” and then to “blog”).

Jimmy Wales loved the World Book Encyclopedia but wished he had access to more. He created Nupedia as a rigorously reviewed online encyclopedia with content written by experts, but it was not fun. After he picked up the idea of wikis, pages that anyone can edit, Wikipedia took off. The wisdom of crowds makes Wikipedia more accurate and neutral because it is edited by all sides. Fittingly, Wales named his daughter Ada, after Lady Lovelace.

Eventually, hand-compiled directories could not keep up with the pace of the web. Enter: Google. Larry Page was practically born computing. He met Sergey Brin at orientation for Stanford’s grad program. Brin was gregarious and charmingly brash, while Page was quiet, intellectual, and reflective. They realized the problem with Tim Berners-Lee’s hypertext was that it wasn’t bidirectional: you didn’t know which sites were linking to each page. Page and Brin essentially followed hypertext in reverse to map the web and return search results. Google was born. They took this web crawling to the next level with PageRank, which ranked results by quality using a recursive process based on the quality of the links coming into and going out of each page. This was mathematically complex and quite different from existing web crawlers. Google finally connected humans and computers in symbiosis.
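The recursive idea is easier to see in code. Below is a minimal, illustrative PageRank sketch in Python (a simplification of the published algorithm, not Google’s actual implementation): each page’s score is repeatedly recomputed from the scores of the pages that link to it, with a standard damping factor, until the numbers settle.

```python
# Tiny made-up link graph: each page lists the pages it links out to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

damping = 0.85
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}        # start every page with an equal score

for _ in range(50):                              # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share            # pass rank along each outgoing link
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")                # C and A end up ranked highest
```

The “quality of links coming in” is exactly the share of rank that each linking page passes along, divided among its outgoing links.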

Artificial Intelligence


Watson AI winning Jeopardy!

Artificial intelligence has been studied since 1956, and researchers have repeatedly announced that true AI is almost here; sixty years on, it still isn’t. Current AI innovations are impressive, yet they still do not function like a human brain: Deep Blue wins at chess thanks to brute-force calculation, and Watson wins at Jeopardy! thanks to enormous computing power. It is said that “the hard problems are easy and the easy problems are hard” (472). Chess yields to a machine, but ask an AI “Can a crocodile play basketball?” and it will struggle. A four-year-old human could answer this, but the human brain simply works differently from silicon-based binary logic. Symbiosis is the key to augmenting human intellect. The most powerful partnerships occur when humans work with computers, such as chess masters teaming up with machines in tournaments.

So, would I recommend The Innovators?

Yes. I feel a lot more informed about how all these innovations came to be after reading this book. I had previously read Steve Jobs by Walter Isaacson, and loved his journalistic style. However, it is a technically dense, 500-page book. If you’re not up for the challenge, hopefully this book summary will help!

8 thoughts on “Crash Course: The History of Computers, Microchips, and the Internet”

  1. Hey Katie, I loved your crash course! It’s impressive that you retained so much from such a (in your words) technically dense book. If I ever take a computer software history course I will be sure to re-read this!


  2. Hi Katie! Amazing overview of some of the greatest innovations of all time. I feel so much more knowledgeable just having read your summary; I can only imagine how much you gained from reading this book! I’ll definitely keep this title in mind if I want to delve deeper into the history of computer technology.


  3. I loved this book. I do agree that it’s dense, but it gives you such a concise history of computing that I absolutely think it was worth the investment. You should probably keep it as a reference for the future. I particularly liked the last chapter about the future.

    If you liked this one, another one I can recommend is Bill Bryson’s A Short History of Nearly Everything. It’s a similar approach to science in general (and Bryson has a great sense of humor). I read it again about every 5 years.

    And this is an EXTREMELY thorough summary. Thank you for taking the time!


  4. Wow, Katie! This is so comprehensive and beautifully organized! I love how you said it would be sacrilegious if you left anything out, and you certainly didn’t! I learned so much from this post I don’t even know where to start. Two things really stood out to me:
    1. Your observation about the power of collaboration is something to keep in mind. The common theme with a lot of these origin stories is that it took a dynamic pair or team or series of individual contributors to create these inventions. Humans working together are pretty amazing.
    2. Artificial intelligence has been around since the 1950s, yet it still hasn’t come as far as we would have thought. I liked your point about what seems hard to humans could be easily accomplished by machines but what is easy to us can stump even the most sophisticated machines. It gives me some comfort to know that we still have some inherent advantages and that when we collaborate with technology we are strongest. Humans working with machines are pretty amazing.


  5. Hi Katie, awesome & well-organized overview of computer history! I feel like I just read a crash course on the subject. I’m especially excited to see what our (and future) generations will have, especially regarding Artificial Intelligence!


  6. Kudos to you, Katie. This summary is extremely detailed. My favorite takeaway from this book is that innovation is never one person, yet we often credit past inventions to just one person. It’s just like what I learned in my book: most people think of Jeff Bezos as the inventor of Amazon, but Shel Kaphan was the original technical engineer behind Bezos’s vision. The power of collaboration is definitely something we should keep in mind in this course!


  7. Hi Katie! Thank you so much for this extremely informative and interesting book summary. I had learned pieces of the history from different places, but this book and your summary certainly make all the dots connect! I also agree with others’ thoughts on human collaboration with machines. AI should be something exciting, not as threatening as it is sometimes portrayed.

