Valley of Genius: The Uncensored History of Silicon Valley (As Told by the Hackers, Founders, and Freaks Who Made It Boom) by Adam Fisher
Airbnb, Albert Einstein, AltaVista, Apple II, Apple's 1984 Super Bowl advert, augmented reality, autonomous vehicles, Bob Noyce, Brownian motion, Buckminster Fuller, Burning Man, Byte Shop, cognitive dissonance, disintermediation, don't be evil, Donald Trump, Douglas Engelbart, Dynabook, Elon Musk, frictionless, glass ceiling, Hacker Ethic, Howard Rheingold, HyperCard, hypertext link, index card, informal economy, information retrieval, Jaron Lanier, Jeff Bezos, Jeff Rulifson, John Markoff, Jony Ive, Kevin Kelly, Kickstarter, knowledge worker, life extension, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Maui Hawaii, Menlo Park, Metcalfe’s law, Mother of all demos, move fast and break things, Network effects, new economy, nuclear winter, PageRank, Paul Buchheit, paypal mafia, peer-to-peer, Peter Thiel, pets.com, pez dispenser, popular electronics, random walk, risk tolerance, Robert Metcalfe, rolodex, self-driving car, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, social graph, social web, South of Market, San Francisco, Startup school, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, telerobotics, The Hackers Conference, the new new thing, Tim Cook: Apple, tulip mania, V2 rocket, Whole Earth Catalog, Whole Earth Review, Y Combinator
But Woz designed it to use a standard color television set, which could be gotten very cheaply. Steve Wozniak: It made it possible for a little one-dollar chip to generate color instead of a thousand-dollar color-generation board. Lee Felsenstein: Nobody had to pay for that additional hardware to do all that vector generation and phasing and so forth. Andy Hertzfeld: It was the single cleverest thing in the Apple II. That was one of the first revolutions. Steve Wozniak: And then I thought, I wonder if I can write a game that’s playable with my slow BASIC? Dan Kottke: A video game has to move fast. Maybe the BASIC was fast enough. But most video games would not be written in BASIC. Steve Wozniak: I had done Breakout for Atari. I knew Breakout. Would it be fast enough or would it be too slow?
If you can do this with this machine, imagine what you could do on a more serious machine—like the Alto or the machines that were coming. Chuck Thacker: VisiCalc was a thing that was useful to businesspeople, and a lot of them. And it did increase the productivity of people who used that kind of tool—enormously. Butler Lampson: It was the success of the Apple II and VisiCalc that created the whole personal computer industry, really. Steve Wozniak: The Apple II was the only one of those computers, the three of them that existed, that had enough memory to run VisiCalc. So they had to write it for this computer only. Everybody else had to go back to the drawing board and make computers that could add floppy disks and more memory. So that was a big leap for us. It was an accident, too; we hadn’t really thought about how we were going to make sure that we were well ahead of the competition.
Bruce Horn: Atkinson was just looking at it so closely—trying to figure it out. Andy Hertzfeld: Bill Atkinson was the main graphics engineer at Apple, and he was doing the graphics on the Lisa. Bruce Horn: He was nose to the screen, just trying to figure it out. Trip Hawkins: We were not complete strangers to bitmapped graphics, because Apple II had them. It’s just what you could do with them on an Apple II was kind of limited. Trip Hawkins: What PARC had was completely innovative thinking about the entire user experience. Steve Wozniak: Multiple windows on the same computer screen? When I saw that I said, “God, it’s like you’ve got three computers in one! Once you have that you’ll never go back.” The Smalltalk language allowed them to write software in a different way than ever before. Adele Goldberg: So we proceeded to have a Smalltalk language and implementation discussion, practically giving the answers to the team.
Revolution in the Valley: The Insanely Great Story of How the Mac Was Made by Andy Hertzfeld
We’ll See About That November 1979 Burrell proves his mettle with the 80K language card Burrell Smith was a 23-year-old, self-taught engineer, without a college degree, who was drawn to Apple by the sheer elegance of the Apple II design. Apple hired him in February 1979 as employee #282, a lowly service technician responsible for fixing broken Apple IIs that were returned by customers. As he debugged broken logic boards, sometimes more than a dozen in a single day, he developed a profound respect and empathy for Steve Wozniak’s unique and creative design techniques. Meanwhile, the Lisa software team was writing their first code in Pascal running on Apple IIs because the Lisa hardware wasn’t ready yet. They had been at it for almost a year and had written more code than would fit in the 64K bytes of memory in a standard Apple II. In fact, the Apple II had only 48K bytes on its main board, but it used a “language” card to give it an extra 16K bytes to run Pascal.
Eventually, I became so obsessed with the Apple II that I had to go to work at the place that created it. I abandoned graduate school and started work as a systems programmer at Apple in August 1979. Even though the Apple II was overflowing with both technical and marketing genius, the best thing about it was the spirit of its creation. It was not conceived or designed as a commercial product in the usual sense. Apple cofounder Steve Wozniak was just trying to make a great computer for himself and impress his friends at the Homebrew Computer Club. His design somehow projected an audacious sense of infinite horizons, as if the Apple II could do anything, if you were just clever enough. Most of the early Apple employees were their own ideal customers. The Apple II was simultaneously a work of art and the fulfillment of a dream, shared by Apple’s employees and customers.
It involved much more work for me than the Apple I and Apple II computer designs together. I was too shy to talk and had to do impressive things to get others to speak first, and this BASIC did help. I had much floating-point experience. In fact, my floating-point math routines were included in the Apple II ROMs, although not incorporated into the BASIC. A floating-point BASIC was listed as one of the highest priorities for the Apple II in the same meeting that listed a floppy drive as the other one. Both were important for the Checkbook program that was shipping on cassette tape with the Apple II. Randy Wigginton and I were working on defining a rather advanced BASIC with floating point, and much more, when Microsoft sent us their 6502 BASIC. There was little need to work on our own at that point. Steve Wozniak Mea Culpa A confession of our worst mistakes Almost everyone involved with the design of the original Macintosh is proud of the work that they did on the project, both individually and collectively.
Fire in the Valley: The Birth and Death of the Personal Computer by Michael Swaine, Paul Freiberger
1960s counterculture, Amazon Web Services, Apple II, barriers to entry, Bill Gates: Altair 8800, Byte Shop, cloud computing, commoditize, computer vision, Douglas Engelbart, Dynabook, Google Chrome, I think there is a world market for maybe five computers, Internet of things, Isaac Newton, Jaron Lanier, job automation, John Markoff, John von Neumann, Jony Ive, Loma Prieta earthquake, Marc Andreessen, Menlo Park, Mitch Kapor, Mother of all demos, Paul Terrell, popular electronics, Richard Stallman, Robert Metcalfe, Silicon Valley, Silicon Valley startup, stealth mode startup, Steve Ballmer, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, Tim Cook: Apple, urban sprawl, Watson beat the top human players on Jeopardy!, Whole Earth Catalog
The Debut The young company faced a more modest challenge than tackling the company that had defined “computer” for generations: they had to finish the Apple II design in time for Jim Warren’s first West Coast Computer Faire in April and get it ready for production shortly thereafter. Markkula was already signing up distributors nationwide, many of whom were eager to work with a company that would give them greater freedom than microcomputer manufacturer MITS had, as well as provide a product that actually did something. * * * Figure 62. Steve Wozniak Woz scrambles for a phone in one of Apple’s early offices. (Courtesy of Margaret Kern Wozniak) Steve Wozniak is justly credited with the technical design of the Apple I and Apple II. Nevertheless, an essential contribution to making the Apple II a commercial success came from Jobs. Early microcomputers were typically drab and ugly metal boxes.
He helped Jobs with the business plan and obtained a line of credit for Apple at the Bank of America. He told Woz and Jobs that neither of them had the experience to run a company and hired a president: Michael Scott, nicknamed Scotty, a seasoned executive who had worked for him at Fairchild. Designing the Apple II By the fall of 1976, Woz had already made progress on the design of his new computer. The Apple II would embody all the engineering savvy he could bring to it. It would be the embodiment of Steve Wozniak’s dream computer, one he would like to own himself. He had made it considerably faster than the Apple I. There was a clever trick he wanted to try that would give the machine a color display, too. Wozniak was skittish about forming a company from the start, and now he was worried about working full time for it.
One day he sneaked into a programmer’s cubicle and placed a mouse inside his computer. When the programmer returned, it took him more than a few minutes to figure out why his Apple was squeaking. Meanwhile, without the singular vision of a Steve Wozniak, the Apple III project was floundering. Delays in the Apple III were soon causing concern in the marketing department. The young company was beginning to feel growing pains at last. When Apple was formed, the Apple II was already near completion. The Apple III was the first computer that Apple—as a company—had designed and built from scratch. The Apple III was also the first Apple not conceived by Steve Wozniak in pursuit of his personal dream machine. Instead the Apple III was a bit of a hodgepodge, pasted together by many hands and designed by committee. And, as is typical of anything created by committee, the various members weren’t completely happy with the results.
Hackers: Heroes of the Computer Revolution - 25th Anniversary Edition by Steven Levy
air freight, Apple II, Bill Gates: Altair 8800, Buckminster Fuller, Byte Shop, computer age, computer vision, corporate governance, Donald Knuth, El Camino Real, game design, Hacker Ethic, hacker house, Haight Ashbury, John Conway, John Markoff, Mark Zuckerberg, Menlo Park, non-fiction novel, Norman Mailer, Paul Graham, popular electronics, RAND corporation, reversible computing, Richard Stallman, Silicon Valley, software patent, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, The Hackers Conference, Whole Earth Catalog, Y Combinator
The Apple ad even said, “our philosophy is to provide software for our machines free or at minimal cost.” While the selling was going on, Steve Wozniak began working on an expanded design of the board, something that would impress his Homebrew peers even more. Steve Jobs had plans to sell many computers based on this new design, and he started getting financing, support, and professional help for the day the product would be ready. The new version of Steve Wozniak’s computer would be called the Apple II, and at the time no one suspected that it would become the most important computer in history. • • • • • • • • It was the fertile atmosphere of Homebrew that guided Steve Wozniak through the incubation of the Apple II. The exchange of information, the access to esoteric technical hints, the swirling creative energy, and the chance to blow everybody’s mind with a well-hacked design or program . . . these were the incentives which only increased the intense desire Steve Wozniak already had: to build the kind of computer he wanted to play with.
I believe their story—their vision, their intimacy with the machine itself, their experiences inside their peculiar world, and their sometimes dramatic, sometimes absurd “interfaces” with the outside world—is the real story of the computer revolution. Who’s Who: The Wizards and Their Machines Bob Albrecht. Founder of People’s Computer Company who took visceral pleasure in exposing youngsters to computers. Altair 8800. The pioneering microcomputer that galvanized hardware hackers. Building this kit made you learn hacking. Then you tried to figure out what to do with it. Apple II. Steve Wozniak’s friendly, flaky, good-looking computer, wildly successful and the spark and soul of a thriving industry. Atari 800. This home computer gave great graphics to game hackers like John Harris, though the company that made it was loath to tell you how it worked. Bob and Carolyn Box. World-record-holding gold prospectors turned software stars, working for Sierra On-Line. Doug Carlston. Corporate lawyer who chucked it all to form the Brøderbund software company.
TX-0. Filled a small room, but in the late fifties, this $3 million machine was the world’s first personal computer—for the community of MIT hackers that formed around it. Jim Warren. Portly purveyor of “techno-gossip” at Homebrew, he was first editor of hippie-styled Dr. Dobb’s Journal, later started the lucrative Computer Faire. Randy Wigginton. Fifteen-year-old member of Steve Wozniak’s kiddie corps, he helped Woz trundle the Apple II to Homebrew. Still in high school when he became Apple’s first software employee. Ken Williams. Arrogant and brilliant young programmer who saw the writing on the CRT and started Sierra On-Line to make a killing and improve society by selling games for the Apple computer. Roberta Williams. Ken Williams’ timid wife who rediscovered her own creativity by writing Mystery House, the first of her many bestselling computer games.
Steve Jobs by Walter Isaacson
air freight, Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, big-box store, Bob Noyce, Buckminster Fuller, Byte Shop, centre right, Clayton Christensen, cloud computing, commoditize, computer age, computer vision, corporate governance, death of newspapers, don't be evil, Douglas Engelbart, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, fixed income, game design, Golden Gate Park, Hacker Ethic, hiring and firing, Jeff Bezos, Johannes Kepler, John Markoff, Jony Ive, lateral thinking, Mark Zuckerberg, Menlo Park, Mitch Kapor, Mother of all demos, Paul Terrell, profit maximization, publish or perish, Richard Feynman, Robert Metcalfe, Robert X Cringely, Ronald Reagan, Silicon Valley, skunkworks, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supply-chain management, thinkpad, Tim Cook: Apple, Wall-E, Whole Earth Catalog
“Well, give me a hug,” he said. And so they hugged. But the biggest news that month was the departure from Apple, yet again, of its cofounder, Steve Wozniak. Wozniak was then quietly working as a midlevel engineer in the Apple II division, serving as a humble mascot of the roots of the company and staying as far away from management and corporate politics as he could. He felt, with justification, that Jobs was not appreciative of the Apple II, which remained the cash cow of the company and accounted for 70% of its sales at Christmas 1984. “People in the Apple II group were being treated as very unimportant by the rest of the company,” he later said. “This was despite the fact that the Apple II was by far the largest-selling product in our company for ages, and would be for years to come.” He even roused himself to do something out of character; he picked up the phone one day and called Sculley, berating him for lavishing so much attention on Jobs and the Macintosh division.
Steve Jobs, address to the Aspen Design Conference, June 15, 1983, tape in Aspen Institute archives; Apple Computer Partnership Agreement, County of Santa Clara, Apr. 1, 1976, and Amendment to Agreement, Apr. 12, 1976; Bruce Newman, “Apple’s Lost Founder,” San Jose Mercury News, June 2, 2010; Wozniak, 86, 176–177; Moritz, 149–151; Freiberger and Swaine, 212–213; Ashlee Vance, “A Haven for Spare Parts Lives on in Silicon Valley,” New York Times, Feb. 4, 2009; Paul Terrell interview, Aug. 1, 2008, mac-history.net. Garage Band: Interviews with Steve Wozniak, Elizabeth Holmes, Daniel Kottke, Steve Jobs. Wozniak, 179–189; Moritz, 152–163; Young, 95–111; R. S. Jones, “Comparing Apples and Oranges,” Interface, July 1976. CHAPTER 6: THE APPLE II An Integrated Package: Interviews with Steve Jobs, Steve Wozniak, Al Alcorn, Ron Wayne. Wozniak, 165, 190–195; Young, 126; Moritz, 169–170, 194–197; Malone, v, 103. Mike Markkula: Interviews with Regis McKenna, Don Valentine, Steve Jobs, Steve Wozniak, Mike Markkula, Arthur Rock. Nolan Bushnell, keynote address at the ScrewAttack Gaming Convention, Dallas, July 5, 2009; Steve Jobs, talk at the International Design Conference at Aspen, June 15, 1983; Mike Markkula, “The Apple Marketing Philosophy” (courtesy of Mike Markkula), Dec. 1979; Wozniak, 196–199.
“You haven’t produced anything.” Jobs began to cry, which was not unusual. He had never been, and would never be, adept at containing his emotions. He told Steve Wozniak that he was willing to call off the partnership. “If we’re not fifty-fifty,” he said to his friend, “you can have the whole thing.” Wozniak, however, understood better than his father the symbiosis they had. If it had not been for Jobs, he might still be handing out schematics of his boards for free at the back of Homebrew meetings. It was Jobs who had turned his ingenious designs into a budding business, just as he had with the Blue Box. He agreed they should remain partners. It was a smart call. To make the Apple II successful required more than just Wozniak’s awesome circuit design. It would need to be packaged into a fully integrated consumer product, and that was Jobs’s role.
Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything by Steven Levy
Apple II, Apple's 1984 Super Bowl advert, computer age, conceptual framework, Douglas Engelbart, Dynabook, Howard Rheingold, HyperCard, information retrieval, information trail, John Markoff, Kickstarter, knowledge worker, Marshall McLuhan, Mitch Kapor, Mother of all demos, Productivity paradox, QWERTY keyboard, rolodex, Silicon Valley, skunkworks, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Ted Nelson, the medium is the message, Vannevar Bush
In his nightmares, he churned out workmanlike code for creepy bosses in suits. Then he discovered the Apple II. "It changed my life," Andy told me on that first day we met. "The more I learned about it, the more I was impressed with its brilliance." He dropped out of graduate school and began writing Apple programs. One of his hacks filled a gap in the Apple II that Jef Raskin had first identified: it displayed only uppercase letters. His first impulse was to give the program away; in Andy Hertzfeld's mind, anything that helps people use a computer more efficiently is a good in and of itself. But a friend convinced him to sell it, and Hertzfeld made $40,000 in a few months. Andy went to work for Apple in 1979. In some ways it was a dream; he had access to the secrets of the Apple II, and even began a friendship with his hero, Steve Wozniak. On the other hand, the company was just beginning its accommodation with hypergrowth, with some disturbing side effects.
Jobs's parsimony was rooted in a worldview shaped in the early days of personal computing, where very little memory meant quite a lot. The first computers discussed at the Homebrew Computer Club, Steve Wozniak's home base, had 4K memory, the rough equivalent of four pages of text. When the Apple II shipped with 48K memory, it was deemed an enormous expanse. Later, Apple II users would buy circuit boards to add 16K more, and then they really felt they were humming. So it is not surprising that Jobs felt 128K sufficient. But there were two errors in his thinking. First, since Macintosh was a bit-mapped machine, considerably more memory was required than with the character-based Apple II or IBM PC. When one of these displayed a word, it used a shortcut: there was a single one-byte computer code for each letter, and that chunk of code threw the letter on the screen.
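The memory gap Levy describes can be made concrete with a rough back-of-envelope calculation. The figures assumed here are the Apple II's 40×24 text mode at one byte per character cell and the original Macintosh's 512×342 one-bit-per-pixel display; this is an editorial sketch, not a claim from the excerpted text:

```python
# Back-of-envelope comparison: character-mapped vs. bit-mapped display memory.
# Assumptions: Apple II 40x24 text screen, one byte per character cell;
# original Macintosh 512x342 screen at 1 bit per pixel.

text_cols, text_rows = 40, 24
char_mode_bytes = text_cols * text_rows            # one byte "throws" one glyph

mac_w, mac_h, bits_per_pixel = 512, 342, 1
bitmap_bytes = mac_w * mac_h * bits_per_pixel // 8 # pixels packed 8 to a byte

print(char_mode_bytes)                  # 960 bytes for a full Apple II text screen
print(bitmap_bytes)                     # 21888 bytes for the Mac's frame buffer
print(bitmap_bytes // char_mode_bytes)  # 22x more memory just to hold the screen
```

By this rough count, the Mac's frame buffer alone consumed over a sixth of its 128K, before any program code or data, which is why the character-mode shortcut no longer applied.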
As a consequence of the company's success, Apple very quickly had to shift from a garage mentality to the mindset of a budding corporation—one valued, at the time of the PARC visit, at over a billion dollars. It filled several low-slung office buildings in Cupertino, and had hundreds of employees. Though the Apple II was wonderful for its time, Apple's leaders realized that the company needed new products to remain competitive. They began work on the Apple III, a machine roughly as powerful as IBM's personal computer would be. But Steve Jobs had an idea for something even more special: Lisa, a computer that would leapfrog Apple's technology, surpassing not only the Apple II, but the Apple III as well. This jump would also vault Apple a generation or so past anything that its competitors were preparing. Begun when Steve Wozniak, at Steve Jobs's request, sketched its architecture on a napkin, Lisa had, in less than a year, evolved to a computer based on the powerful Motorola 68000 microprocessor chip, and was engineered to handle more complicated applications, even running several at the same time, a trick called "multitasking."
Exploding the Phone: The Untold Story of the Teenagers and Outlaws Who Hacked Ma Bell by Phil Lapsley
air freight, Apple II, Bill Gates: Altair 8800, Bob Noyce, card file, cuban missile crisis, dumpster diving, Hush-A-Phone, index card, Jason Scott: textfiles.com, John Markoff, Menlo Park, popular electronics, Richard Feynman, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, the new new thing, the scientific method, undersea cable, urban renewal, wikimedia commons
While there, Draper claims, he taught the art of phone phreaking to dozens of other inmates. Draper soon went to work for his friend Steve Wozniak at Apple Computer, designing an innovative product called the Charley Board. Charley was an add-in circuit board for the Apple II that connected the computer to the telephone line. With Charley and a few simple programs you could make your Apple II do all sorts of telephonic tricks. Not only could it dial telephone numbers and send touch tones down the line, it could even listen to the calls it placed and recognize basic telephone signals as the call progressed, signals such as a dial tone or busy signal or a ringing signal. With the right programming it could be used as a modem. An Apple II with a Charley Board, in fact, became the ultimate phone phreaking tool. Just as the phone company thought it was natural to mix computers and phone switches, John Draper thought it was natural to mix computers and phone phreaking.
Every one needed hardware and software hackers to help them. Riches, or promises of riches, or maybe just a fun job that might pay the bills beckoned. In 1976 former phone phreaks Steve Jobs and Steve Wozniak were selling Apple I computers to their fellow hobbyists. “Jobs placed ads in hobbyist publications and they began selling Apples for the price of $666.66,” journalist Steven Levy wrote. “Anyone in Homebrew could take a look at the schematics for the design, Woz’s BASIC was given away free with the purchase of a piece of equipment that connected the computer to a cassette recorder.” The fully assembled and tested Apple II followed later that year. By 1977 microcomputers had begun to enter the mainstream. You could stroll down to your local Radio Shack and buy a TRS-80 microcomputer off the shelf, something absolutely unheard of just a year earlier.
Excerpt from IWOZ: COMPUTER GEEK TO CULT ICON: HOW I INVENTED THE PERSONAL COMPUTER, COFOUNDED APPLE, AND HAD FUN DOING IT by Steve Wozniak and Gina Smith. Copyright © 2006 by Steve Wozniak and Gina Smith. Used by permission of W. W. Norton & Company, Inc. ISBN-13: 978-0-8021-9375-9 Grove Press an imprint of Grove/Atlantic, Inc. 841 Broadway New York, NY 10003 Distributed by Publishers Group West www.groveatlantic.com 13 14 15 16 10 9 8 7 6 5 4 3 2 1 To the men and women of the Bell System, and especially to the members of the technical staff of Bell Laboratories, without whom none of this would have been possible CONTENTS FOREWORD BY STEVE WOZNIAK A NOTE ON NAMES AND TENSES CHAPTER 1 FINE ARTS CHAPTER 2 BIRTH OF A PLAYGROUND CHAPTER 3 CAT AND CANARY CHAPTER 4 THE LARGEST MACHINE IN THE WORLD CHAPTER 5 BLUE BOX CHAPTER 6 “SOME PEOPLE COLLECT STAMPS” CHAPTER 7 HEADACHE CHAPTER 8 BLUE BOX BOOKIES CHAPTER 9 LITTLE JOJO LEARNS TO WHISTLE CHAPTER 10 BILL ACKER LEARNS TO PLAY THE FLUTE CHAPTER 11 THE PHONE FREAKS OF AMERICA PHOTO INSERT CHAPTER 12 THE LAW OF UNINTENDED CONSEQUENCES CHAPTER 13 COUNTERCULTURE CHAPTER 14 BUSTED CHAPTER 15 PRANKS CHAPTER 16 THE STORY OF A WAR CHAPTER 17 A LITTLE BIT STUPID CHAPTER 18 SNITCH CHAPTER 19 CRUNCHED CHAPTER 20 TWILIGHT CHAPTER 21 NIGHTFALL EPILOGUE SOURCES AND NOTES ACKNOWLEDGMENTS INDEX THE PLAYGROUND Phone phreak (n.) 1.
Founders at Work: Stories of Startups' Early Days by Jessica Livingston
8-hour work day, affirmative action, AltaVista, Apple II, Brewster Kahle, business cycle, business process, Byte Shop, Danny Hillis, David Heinemeier Hansson, don't be evil, fear of failure, financial independence, Firefox, full text search, game design, Googley, HyperCard, illegal immigration, Internet Archive, Jeff Bezos, Joi Ito, Justin.tv, Larry Wall, Maui Hawaii, Menlo Park, Mitch Kapor, nuclear winter, Paul Buchheit, Paul Graham, Peter Thiel, Richard Feynman, Robert Metcalfe, Ruby on Rails, Sam Altman, Sand Hill Road, side project, Silicon Valley, slashdot, social software, software patent, South of Market, San Francisco, Startup school, stealth mode startup, Steve Ballmer, Steve Jobs, Steve Wozniak, web application, Y Combinator
Little did he know that I was actually up all night writing a business plan, not partying. CHAPTER 3 Steve Wozniak Cofounder, Apple Computer If any one person can be said to have set off the personal computer revolution, it might be Steve Wozniak. He designed the machine that crystallized what a desktop computer was: the Apple II. Wozniak and Steve Jobs founded Apple Computer in 1976. Between Wozniak’s technical ability and Jobs’s mesmerizing energy, they were a powerful team. Woz first showed off his home-built computer, the Apple I, at Silicon Valley’s Homebrew Computer Club in 1976. After Jobs landed a contract with the Byte Shop, a local computer store, for 100 preassembled machines, Apple was launched on a rapid ascent. Woz soon followed with the machine that made the company: the Apple II. He single-handedly designed all its hardware and software—an extraordinary feat even for the time.
So I said to him, “Now that games are software, it’s going to be a different world for games.” And the Apple II, so many people just started trying to figure out how can you get rocket ships to launch, how can you get things that sound like sound when you have a real cruddy voltage to a speaker. How do you listen to somebody talk and figure out what they said? They started using the Apple II. It was just open to all these things. We made it easy for anyone to do what they wanted to do. And I think that was one of the biggest keys to its success. We didn’t make it a hidden machine that we own—we sell it, it does this, you got it—like Commodore and RadioShack did. We put out manuals that had just hundreds of pages of listings of code, descriptions of circuits, examples of boards that you would plug in—so that anyone could look at this and say, “Now I know how I would do my own.”
I can’t remember the name of the company out of the East, but a venture group. They came in and met us all early on, and they did put in . . . Mike figured out that we were going to need some cash, we were going to be so fast growing. And when you are fast growing, you need more cash right away. So we did have a venture deal in place from well before we shipped an Apple II. And sometime after we were shipping the Apple IIs, we got, I think, $800,000 or $300,000—some large amount—from one venture capital place. Livingston: On the East Coast? Wozniak: I believe that’s where we arranged it. Mike Markkula had worked with this guy Hank Smith at Intel, so that’s how they knew each other. And I think Don Valentine actually put some money in, but then it came to a point where he wanted to make some good money and buy some stock off Steve Jobs for like $5.50 before we went public. $5.50 a share, and Steve thought it was too low.
The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution by Walter Isaacson
1960s counterculture, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, Apple II, augmented reality, back-to-the-land, beat the dealer, Bill Gates: Altair 8800, bitcoin, Bob Noyce, Buckminster Fuller, Byte Shop, c2.com, call centre, citizen journalism, Claude Shannon: information theory, Clayton Christensen, commoditize, computer age, crowdsourcing, cryptocurrency, Debian, desegregation, Donald Davies, Douglas Engelbart, Douglas Hofstadter, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, en.wikipedia.org, Firefox, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, Haight Ashbury, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, index card, Internet Archive, Jacquard loom, Jaron Lanier, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Joseph-Marie Jacquard, Leonard Kleinrock, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, Mother of all demos, new economy, New Journalism, Norbert Wiener, Norman Macrae, packet switching, PageRank, Paul Terrell, pirate software, popular electronics, pre–internet, RAND corporation, Ray Kurzweil, RFC: Request For Comment, Richard Feynman, Richard Stallman, Robert Metcalfe, Rubik’s Cube, Sand Hill Road, Saturday Night Live, self-driving car, Silicon Valley, Silicon Valley startup, Skype, slashdot, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Steven Pinker, Stewart Brand, technological singularity, technoutopianism, Ted Nelson, The Coming Technological Singularity, The Nature of the Firm, The Wisdom of Crowds, Turing complete, Turing machine, Turing test, Vannevar Bush, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Whole Earth Review, wikimedia commons, William Shockley: the traitorous eight
When Wired magazine featured maker culture in its April 2011 issue, it put a woman engineer on its cover for the first time, the MIT-trained do-it-yourself entrepreneur Limor Fried, whose moniker “ladyada” and company name Adafruit Industries were homages to Ada Lovelace. 31. To listen to Dompier’s Altair play “Fool on the Hill,” go to http://startup.nmnaturalhistory.org/gallery/story.php?ii=46. 32. After they became successful, Gates and Allen donated a new science building to Lakeside and named its auditorium after Kent Evans. 33. Steve Wozniak’s unwillingness to tackle this tedious task when he wrote BASIC for the Apple II would later force Apple to have to license BASIC from Allen and Gates. 34. Reading a draft version of this book online, Steve Wozniak said that Dan Sokol made only eight copies, because they were hard and time-consuming to make. But John Markoff, who reported this incident in What the Dormouse Said, shared with me (and Woz and Felsenstein) the transcript of his interview with Dan Sokol, who said he used a PDP-11 with a high-speed tape reader and punch.
Ethernet developed by Bob Metcalfe at Xerox PARC. Community Memory shared terminal set up at Leopold’s Records, Berkeley. Vint Cerf and Bob Kahn complete TCP/IP protocols for the Internet. 1974 Intel 8080 comes out. 1975 Altair personal computer from MITS appears. Paul Allen and Bill Gates write BASIC for Altair, form Microsoft. First meeting of Homebrew Computer Club. 1976 Steve Jobs and Steve Wozniak launch the Apple I. 1977 The Apple II is released. 1978 First Internet Bulletin Board System. 1979 Usenet newsgroups invented. Jobs visits Xerox PARC. 1980 IBM commissions Microsoft to develop an operating system for PC. 1981 Hayes modem marketed to home users. 1983 Microsoft announces Windows. Richard Stallman begins developing GNU, a free operating system. 1984 Apple introduces Macintosh. 1985 Stewart Brand and Larry Brilliant launch The WELL.
Harold Singer, “Open Letter to Ed Roberts,” Micro-8 Computer User Group newsletter, Mar. 28, 1976. 80. Author’s interview with Lee Felsenstein. 81. Bill Gates interview, Playboy, July 1994. 82. This section draws from my Steve Jobs (Simon & Schuster, 2011), which was based on interviews with Steve Jobs, Steve Wozniak, Nolan Bushnell, Al Alcorn, and others. The Jobs biography includes a bibliography and source notes. For this book, I reinterviewed Bushnell, Alcorn, and Wozniak. This section also draws on Steve Wozniak, iWoz (Norton, 2006); Steve Wozniak, “Homebrew and How the Apple Came to Be,” http://www.atariarchives.org/deli/homebrew_and_how_the_apple.php. 83. When I posted an early draft of parts of this book for crowdsourced comments and corrections on Medium, Dan Bricklin offered useful suggestions. We got into an exchange about the creation of VisiCalc, and I subsequently added this section to the book.
Troublemakers: Silicon Valley's Coming of Age by Leslie Berlin
AltaVista, Apple II, Asilomar, Asilomar Conference on Recombinant DNA, beat the dealer, Bill Gates: Altair 8800, Bob Noyce, Byte Shop, Clayton Christensen, cloud computing, computer age, discovery of DNA, don't be evil, Donald Knuth, double helix, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Thorp, El Camino Real, fear of failure, Fellow of the Royal Society, financial independence, game design, Haight Ashbury, hiring and firing, industrial robot, informal economy, Internet of things, inventory management, John Markoff, Kickstarter, Kitchen Debate, Leonard Kleinrock, manufacturing employment, Mark Zuckerberg, Menlo Park, Minecraft, Mother of all demos, packet switching, Ralph Nader, Robert Metcalfe, rolodex, Ronald Reagan, Sand Hill Road, Silicon Valley, Silicon Valley startup, Snapchat, software as a service, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, union organizing, upwardly mobile, William Shockley: the traitorous eight, women in the workforce
Gregg Williams and Rob Moore, “Guide to the Apple,” Byte, December 1984. 17. Harriet Stix, “A UC Berkeley Degree Is Now the Apple of Steve Wozniak’s Eye,” Los Angeles Times, May 14, 1986. 18. Marilyn Chase, “Technical Flaws Plague Apple’s New Computer,” Wall Street Journal, April 15, 1981. Apple III prices ranged from $4,300 to nearly $8,000, compared to the Apple II systems at about half the cost. 19. Apple fixed the problems and brought the Apple III back in late 1981—“Let me re-introduce myself,” one advertisement began—but not much software was written for the machine, and it was not anywhere near as popular as the Apple II. (Sales were around 1,000 per month versus the Apple II’s 15,000.) By one estimate (Brent Schlender and Rick Tetzeli, Becoming Steve Jobs: The Evolution of a Reckless Upstart into a Visionary Leader [New York: Crown Business, 2015]: 72), before the Apple III was discontinued in 1984, only 120,000 had been sold.
President Mike Scott had never run a company; as he explained years later, “I approached Apple as a chance to see if all the things I had learned about management would actually work.” Sales head Gene Carter had never worked in sales. Cofounder Steve Jobs, the vice president of administration, had only administered a company (Apple) with fewer workers than could be counted on one hand. The only people who seemed obvious choices for their jobs were Steve Wozniak, who ran engineering, and Mike Scott, in his capacity as manufacturing head. But it worked. Occasionally lost in the stories of Jobs’s bare feet and Wozniak’s practical joking is the reality of how hard and effectively the young men and their older semiconductor colleagues worked together. The birth of Apple’s floppy disk drive almost exactly a year after the company incorporated, and six months after it began shipping Apple IIs, offers a fine illustration. The Apple II was selling well, but Markkula was convinced it would never make the leap to a broad consumer market unless there were some way to make it load software faster.
“Atari’s meltdown created a tsunami that wiped out public interest in games . . . and gave gaming a stigma that lasted a decade,” says Trip Hawkins. Hawkins, one of Mike Markkula’s “diamonds in the rough” at Apple, had left the company in 1982 to found Electronic Arts to build video games for personal computers. The first outside backer of Electronic Arts was Don Valentine, the venture capitalist who had funded Atari and introduced Mike Markkula to Steve Jobs and Steve Wozniak. Ben Rosen, the optimistic technology analyst, was another early backer. Electronic Arts had its first big hit in October 1983 with Doctor J and Larry Bird Go One-on-One, a basketball game featuring two of the sport’s biggest stars. The title was one of the most popular games for the Apple II and an early indicator of the enormous market for licensed sports video games. (Electronic Arts would exploit this market with great success five years later with the release of John Madden Football.) But in the wake of the 1983 crash, even Electronic Arts, which made games for computers, not dedicated consoles or arcades, had to retrench.
Commodore: A Company on the Edge by Brian Bagnall
Apple II, belly landing, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, Douglas Engelbart, Douglas Engelbart, Firefox, game design, index card, inventory management, Isaac Newton, low skilled workers, Menlo Park, packet switching, pink-collar, popular electronics, prediction markets, pre–internet, QWERTY keyboard, Robert Metcalfe, Robert X Cringely, Silicon Valley, special economic zone, Steve Jobs, Steve Wozniak, Ted Nelson
Unfortunately, the Apple II emitted strong radio interference signals, which the FCC does not allow in consumer devices. To get around the rules, Apple pretended it did not intend the Apple II for the home market. According to Yannes, “They basically said, ‘Okay, this is a data processing device and therefore the Class A FCC rules apply to it (the more relaxed rules) because this is going to be used in an industrial environment and not in a home.’ They couldn’t put an R/F modulator in it to hook to your TV set because obviously that was something for the home.” Without an R/F modulator, the Apple II was too complicated for inexperienced users. “The PET and the TRS-80 both came with their own monitors, so they were a more appropriate solution for most people than the Apple II was,” says Yannes. The original design by Steve Wozniak also had several flaws. “Right after the Apple II came out, Electronic Engineering Times wrote a story about the three major design flaws that Woz made on the Apple II,” says Peddle.
Steve Jobs was trying to talk Commodore into buying the Apple II for a large amount, like hundreds of thousands of dollars. … Steve Jobs also wanted Commodore to hire us along with the proposed deal. The deal was never on paper and never concrete, as to how much.” “The discussions never got beyond a meeting with Jack and Steve,” says Peddle. “Jack’s view was that Steve wanted much too much for the company and in the fall of 1976 he was right. I remember him laughing about Steve Jobs and his view of his company’s position.” Tramiel was willing to purchase Apple, but he wanted the lowest price possible. To do that, he put pressure on Jobs by refusing his initial offer. “Basically Jack decided to go along with it, but he tried to squeeze Steve,” explains Peddle. Steve Wozniak tells a different story. “We were told that Chuck wanted to do his own thing and that he could do better than us at reaching the cheap needs of customers.
 According to Herd, “Russell was later rewarded with a $20,000 bonus for coming up with this solution.”  Compute! Magazine, October 1982.  One of these users was Linus Torvalds. In 1981, the nine-year-old received a VIC-20 from his grandfather and used it to learn BASIC programming. He was taking the first steps which eventually led him to create a revolution with Linux.  Time magazine, “The Hottest-Selling Hardware” (January 3, 1983), p. 37.  Steve Wozniak has attempted to claim the Apple II was the first to a million. On BBC World’s Most Powerful, aired December 2003, Wozniak claimed, “Sales shot sky high. Apple was the first company to sell a hundred thousand computers—a million computers.” CHAPTER 29 Selling the Revolution 1982 With the engineering job complete (or at least good enough), Charlie Winterble’s team reluctantly stepped away from the Commodore 64.
Start With Why: How Great Leaders Inspire Everyone to Take Action by Simon Sinek
Apple II, Apple's 1984 Super Bowl advert, Black Swan, business cycle, commoditize, hiring and firing, John Markoff, low cost airline, Nick Leeson, RAND corporation, risk tolerance, Ronald Reagan, shareholder value, Steve Ballmer, Steve Jobs, Steve Wozniak, The Wisdom of Crowds, trade route
They hung out with hippie types who shared their beliefs, but they saw a different way to change the world that didn’t require protesting or engaging in anything illegal. Steve Wozniak and Steve Jobs came of age in this time. Not only was the revolutionary spirit running high in Northern California, but it was also the time and place of the computer revolution. And in this technology they saw the opportunity to start their own revolution. “The Apple gave an individual the power to do the same things as any company,” Wozniak recounts. “For the first time ever, one person could take on a corporation simply because they had the ability to use the technology.” Wozniak engineered the Apple I and later the Apple II to be simple enough for people to harness the power of the technology. Jobs knew how to sell it. Thus was born Apple Computer. A company with a purpose—to give the individual the power to stand up to established power.
New York: Farrar, Straus and Giroux, 2005. 140 “If it hadn’t been for my big brother”: Bob Thomas, Building a Company: Roy O. Disney and the Creation of an Entertainment Empire. New York: Disney Editions, 1998. 142 Herb Kelleher was able to personify and preach the cause of freedom: Kevin Freiberg and Jackie Freiberg, Nuts! Southwest Airlines’ Crazy Recipe for Business and Personal Success. New York: Broadway, 1998. 142 Steve Wozniak is the engineer who made the Apple work: Steve Wozniak, personal interview, November 2008. 143 Bill Gates and Paul Allen went to high school together in Seattle: Randy Alfred, “April 4, 1975: Bill Gates, Paul Allen Form a Little Partnership,” Wired, April 4, 2008, http://www.wired.com/science/discoveries/news/2008/04/dayintech_0404. 145 Oprah Winfrey once gave away a free car: Ann Oldenburg, “$7M car giveaway stuns TV audience,” USA Today, September 13, 2004, http://www.usatoday.com/life/people/2004-09-13-oprah-cars_x.htm. 150 the Education for Employment Foundation: http://www.efefoundation.org/homepage.html; Lisa Takeuchi Cullen, “Gainful Employment,” Time, September 20, 2007, http://www.time.com/time/magazine/article/0,9171,1663851,00.html; Ron Bruder, personal interview, February 2009.
But it wasn’t until 1976, nearly three years after the end of America’s military involvement in the Vietnam conflict, that a different revolution ignited. They aimed to make an impact, a very big impact, even challenge the way people perceived how the world worked. But these young revolutionaries did not throw stones or take up arms against an authoritarian regime. Instead, they decided to beat the system at its own game. For Steve Wozniak and Steve Jobs, the cofounders of Apple Computer, the battlefield was business and the weapon of choice was the personal computer. The personal computer revolution was beginning to brew when Wozniak built the Apple I. Just starting to gain attention, the technology was primarily seen as a tool for business. Computers were too complicated and out of the price range of the average individual.
The Master Switch: The Rise and Fall of Information Empires by Tim Wu
accounting loophole / creative accounting, Alfred Russel Wallace, Apple II, barriers to entry, British Empire, Burning Man, business cycle, Cass Sunstein, Clayton Christensen, commoditize, corporate raider, creative destruction, disruptive innovation, don't be evil, Douglas Engelbart, Douglas Engelbart, Howard Rheingold, Hush-A-Phone, informal economy, intermodal, Internet Archive, invention of movable type, invention of the telephone, invisible hand, Jane Jacobs, John Markoff, Joseph Schumpeter, Menlo Park, open economy, packet switching, PageRank, profit motive, road to serfdom, Robert Bork, Robert Metcalfe, Ronald Coase, sexual politics, shareholder value, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, Telecommunications Act of 1996, The Chicago School, The Death and Life of Great American Cities, the market place, The Wisdom of Crowds, too big to fail, Upton Sinclair, urban planning, zero-sum game
The Apple’s operating system, using a form of BASIC as its programming language and operating environment, was, moreover, one that anyone could program. It made it possible to write and sell one’s programs directly, creating what we now call the “software” industry. In 2006, I briefly met with Steve Wozniak on the campus of Columbia University. “There’s a question I’ve always wanted to ask you,” I said. “What happened with the Mac? You could open up the Apple II, and there were slots and so on, and anyone could write for it. The Mac was way more closed. What happened?” “Oh,” said Wozniak. “That was Steve. He wanted it that way. The Apple II was my machine, and the Mac was his.” Apple’s origins were pure Steve Wozniak, but as everyone knows, it was the other founder, Steve Jobs, whose ideas made Apple what it is today. Jobs maintained the early image that he and Wozniak created, but beginning with the Macintosh in the 1980s, and accelerating through the age of the iPod, iPhone, and iPad, he led Apple computers on a fundamentally different track.
And it was Wozniak who would conceive of and build the Apple and the Apple II, the most important Apple products ever, and arguably among the most important inventions of the later twentieth century.* For his part, Jobs was the businessman and the dealmaker of the operation, essential as such, but hardly the founding genius of Apple computers, the man whose ideas were turned into silicon to change the world; that was Wozniak. The history of the firm must be understood in this light. For while founders do set the culture of a firm, they cannot dictate it in perpetuity; as Wozniak withdrew from the operation, Apple became more and more concerned with, as it were, the aesthetics of radicalism than with its substance. Steve Wozniak is not the household name that Steve Jobs is, but his importance to communications and culture in the postwar period merits a closer look.
Now as they spoke, a warm glow began to develop between a Montague and a Capulet who fantasized about all-embracing alliance between their seemingly irreconcilable houses. Theirs was to be a union that could move mountains—or at least break down the old barriers and create a perfect new world.2 The two moguls plotting the future of the Internet had something else in common: neither was what you might call a natural computer geek, in the manner of Bill Gates or Steve Jobs. Entrepreneurs like Apple’s Steve Wozniak got started by programming and soldering; Case was an assistant brand manager at Procter & Gamble in Kansas. He might have languished somewhere in upper middle management had he not resolved to grab the ring. Case took a job at a risky computer networking firm named the Control Video Corporation that had already failed twice. Three’s a charm, however: by some miracle, that firm eventually managed to become America Online.3 Once a corporate lawyer, Levin during these years was working as a cable executive.
Becoming Steve Jobs: The Evolution of a Reckless Upstart Into a Visionary Leader by Brent Schlender, Rick Tetzeli
Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, Bill Gates: Altair 8800, Bob Noyce, Byte Shop, Charles Lindbergh, computer age, corporate governance, El Camino Real, Isaac Newton, John Markoff, Jony Ive, Kickstarter, Marc Andreessen, market design, McMansion, Menlo Park, Paul Terrell, popular electronics, QWERTY keyboard, Ronald Reagan, Sand Hill Road, side project, Silicon Valley, Silicon Valley startup, skunkworks, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Tim Cook: Apple, Wall-E, Watson beat the top human players on Jeopardy!, Whole Earth Catalog
Part of my unease came from the fact that, for the first time in my experience as a journalist, I would be calling on a prominent business leader who was younger than I. I was thirty-two years old; Jobs was thirty-one and already a global celebrity, hailed, along with Bill Gates, for having invented the personal computer industry. Long before Internet mania started churning out wunderkinds of the week, Jobs was technology’s original superstar, the real deal with an astounding, substantial record. The circuit boards he and Steve Wozniak had assembled in a garage in Los Altos had spawned a billion-dollar company. The personal computer seemed to have unlimited potential, and as the cofounder of Apple Computer, Steve Jobs had been the face of all those possibilities. But then, in September of 1985, he had resigned under pressure, shortly after telling the company’s board of directors that he was courting some key Apple employees to join him in a new venture to build computer “workstations.”
Rock loved order, he loved processes, he believed that tech companies grew in certain ways according to certain rules, and he subscribed to these beliefs because he’d seen them work before, most notably at Intel, the great Santa Clara chipmaker that he had backed early on. Rock was perhaps the most notable tech investor of his time, but he in fact had been reluctant to back Apple at first, largely because he’d found Steve and his partner Steve Wozniak unpalatable. He didn’t see Apple the way Jobs saw it—as an extraordinary company that would humanize computing and do so with a defiantly unhierarchical organization. Rock simply viewed it as another investment. Steve found board meetings with Rock enervating, not invigorating; he had looked forward to a long, fast drive to Marin with the top down to get rid of the stale stench of seemingly endless discussion.
For the third time in three years, he tried to hire Lasseter away from Pixar. Lasseter wouldn’t go. “I was living in the San Francisco Bay Area,” he remembers. “I was inventing new stuff. I figured I’d just stay on here. I’d had a pretty miserable experience at Disney.” He told Schneider that there was only one way that he’d consider working with Disney—the studio would have to make a movie with Pixar. The Evolution of a CEO Steve Jobs and Steve Wozniak in 1979. The two had founded Apple four years earlier, and the company was growing like crazy. But the best years of their collaboration were already over. Ted Thai/Polaris A 1979 gathering of the Seva Foundation, which Steve backed with a $5,000 donation. His close friend Larry Brilliant is at the center with his baby boy, Joseph; Brilliant’s wife, Girija, is to the right, arms crossed and leaning back.
Inventors at Work: The Minds and Motivation Behind Modern Inventions by Brett Stern
Apple II, augmented reality, autonomous vehicles, bioinformatics, Build a better mousetrap, business process, cloud computing, computer vision, cyber-physical system, distributed generation, game design, Grace Hopper, Richard Feynman, Silicon Valley, skunkworks, Skype, smart transportation, speech recognition, statistical model, stealth mode startup, Steve Jobs, Steve Wozniak, the market place, Yogi Berra
Calvert: These are the things that an inventor really needs to learn and work out before taking that big step of sending a patent application to the USPTO—hopefully as the prelude to starting up their own business. 1 www.uspto.gov/inventors/independent/eye/201206/index.jsp 2 www.uiausa.org CHAPTER 23 Steve Wozniak Co-Founder Apple Computer A Silicon Valley icon and philanthropist for more than thirty years, Steve Wozniak helped shape the computing industry with his design of Apple’s first line of products, the Apple I and II, and influenced the popular Macintosh. In 1976, Wozniak and Steve Jobs founded Apple Computer Inc. with Wozniak’s Apple I personal computer. The following year he introduced his Apple II personal computer, featuring a central processing unit, a keyboard, color graphics, and a floppy disk drive. The Apple II was integral to launching the personal computer industry. Wozniak is named sole inventor on the US patent for “microcomputer for use with video display.”
Tim Leatherman, Folding Hand Tools Chapter 15. Reyn Guyer, Toys Chapter 16. Bernhard van Lengerich, Food Manufacturing Chapter 17. Curt Croley, Shane MacGregor, Graham Marshall, Mobile Devices Chapter 18. Matthew Scholz, Healthcare Products Chapter 19. Daria Mochly-Rosen, Drugs Chapter 20. Martin Keen, Footwear Chapter 21. Kevin Deppermann, Seed Genomes Chapter 22. John Calvert, Elizabeth Dougherty, USPTO Chapter 23. Steve Wozniak, Personal Computers Index About the Author Brett Stern is an industrial designer and inventor living in Portland, Oregon. He holds eight utility patents covering surgical instruments, medical implants, and robotic garment-manufacturing systems. He holds trademarks in 34 countries on a line of snack foods that he created. He has worked as an industrial design consultant for such clients as Pfizer, Revlon, and Saatchi & Saatchi, and as a costume materials technologist for Warner Bros.
Today, at the dawn of the nexus of the future, ideas for inventions stand only a small chance of being realized and competing in the marketplace unless they’re generated or picked up by corporations that can marshal teams of scientists and lawyers underwritten by enterprise-scale capital and infrastructure. Nonetheless, millions of individuals still cherish the dream of inventing and building a better mousetrap, bringing it to market, and being richly rewarded for those efforts. Americans love their pantheon of garage inventors. Thomas Edison, the Wright Brothers, Alexander Graham Bell, Bill Hewlett and Dave Packard, and Steve Wozniak and Steve Jobs are held up as culture heroes, celebrated for their entrepreneurial spirit no less than their inventive genius. This book is a collection of interviews conducted with individuals who have distinguished themselves in the invention space. Some of the inventors interviewed here have their Aha! moments in government, institutional, or industrial labs; develop their inventions with multidisciplinary teams of experts; and leave the marketing of their inventions to other specialists in the organization.
Equal Is Unfair: America's Misguided Fight Against Income Inequality by Don Watkins, Yaron Brook
3D printing, Affordable Care Act / Obamacare, Apple II, barriers to entry, Berlin Wall, Bernie Madoff, blue-collar work, business process, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, collective bargaining, colonial exploitation, corporate governance, correlation does not imply causation, creative destruction, Credit Default Swap, crony capitalism, David Brooks, deskilling, Edward Glaeser, Elon Musk, en.wikipedia.org, financial deregulation, immigration reform, income inequality, indoor plumbing, inventory management, invisible hand, Isaac Newton, Jeff Bezos, Jony Ive, laissez-faire capitalism, Louis Pasteur, low skilled workers, means of production, minimum wage unemployment, Naomi Klein, new economy, obamacare, Peter Singer: altruism, Peter Thiel, profit motive, rent control, Ronald Reagan, Silicon Valley, Skype, statistical model, Steve Jobs, Steve Wozniak, The Spirit Level, too big to fail, trickle-down economics, Uber for X, urban renewal, War on Poverty, wealth creators, women in the workforce, working poor, zero-sum game
Spending on the Basics as a Share of Disposable Personal Income,” HumanProgress.org, http://humanprogress.org/static/us-spending-on-basics (accessed April 13, 2015). 5. Steve Wozniak with Gina Smith, iWoz: Computer Geek to Cult Icon (New York: Norton, 2006), pp. 12–13. 6. Ibid., p. 18. 7. Ibid., pp. 54–55. 8. Ibid., pp. 155–56. 9. “National Inventors Hall of Fame,” Ohio History Central, http://www.ohiohistorycentral.org/w/National_Inventors_Hall_of_Fame?rec=1727 (accessed August 31, 2015). 10. Quoted in Sean Rossman, “Apple’s ‘The Woz’ Talks Jobs, Entrepreneurship,” Tallahassee Democrat, November 6, 2014, http://www.tallahassee.com/story/news/local/2014/11/05/apples-woz-talks-jobs-entrepreneurship/18561425/ (accessed April 13, 2015). 11. Quoted in Alec Hogg, “Apple’s ‘Other’ Steve—Wozniak on Jobs, Starting a Business, Changing the World, and Staying Hungry, Staying Foolish,” BizNews.com, February 17, 2014, http://www.biznews.com/video/2014/02/17/apples-other-steve-wozniak-on-jobs-starting-a-business-changing-the-world/ (accessed April 13, 2015). 12.
But we do live on a Glorious Earth, where we can make life amazing. And it can be amazing for everyone, because it turns out that the way we improve our lives—ingenuity and effort—is not a fixed-sum game, where we battle over a static amount of wealth. We produce wealth, and there is no limit to how much wealth we can produce. Who Created the Modern World? In his autobiography, Apple cofounder Steve Wozniak, or Woz, as he’s usually called, describes how his dad, an engineer, would explain to the four-year-old Woz how electronics worked. “I remember sitting there and being so little, and thinking: ‘Wow, what a great, great world he’s living in,’” Woz recalls. “I mean, that’s all I thought: ‘Wow.’ For people who know how to do this stuff—how to take these little parts and make them work together to do something—well, these people must be the smartest people in the world. . . .
Maybe you teach music to middle-schoolers. Maybe you fix cars or perform brain surgery or, God help you, write books on inequality. Whatever it is, you do productive work in exchange for money, which you use to buy the dizzying array of things that other people produce. But that’s only part of the story. Not all work is equally productive. Some of us create a little wealth. Some of us create a lot. A tiny handful, like Steve Wozniak, create so much that their names go down in the history books. Think of some of the things that make your life wonderful. Your cell phone, your computer, the Internet? You can thank Robert Noyce and Jack Kilby, who invented the integrated circuit. The car that took you to work? You can thank Henry Ford, who transformed the automobile from a curiosity of the rich into a mass-market product.
The Code: Silicon Valley and the Remaking of America by Margaret O'Mara
"side hustle", A Declaration of the Independence of Cyberspace, accounting loophole / creative accounting, affirmative action, Airbnb, AltaVista, Amazon Web Services, Apple II, Apple's 1984 Super Bowl advert, autonomous vehicles, back-to-the-land, barriers to entry, Ben Horowitz, Berlin Wall, Bob Noyce, Buckminster Fuller, Burning Man, business climate, Byte Shop, California gold rush, carried interest, clean water, cleantech, cloud computing, cognitive dissonance, commoditize, computer age, continuous integration, cuban missile crisis, Danny Hillis, DARPA: Urban Challenge, deindustrialization, different worldview, don't be evil, Donald Trump, Doomsday Clock, Douglas Engelbart, Dynabook, Edward Snowden, El Camino Real, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Frank Gehry, George Gilder, gig economy, Googley, Hacker Ethic, high net worth, Hush-A-Phone, immigration reform, income inequality, informal economy, information retrieval, invention of movable type, invisible hand, Isaac Newton, Jeff Bezos, Joan Didion, job automation, job-hopping, John Markoff, Julian Assange, Kitchen Debate, knowledge economy, knowledge worker, Lyft, Marc Andreessen, Mark Zuckerberg, market bubble, mass immigration, means of production, mega-rich, Menlo Park, Mikhail Gorbachev, millennium bug, Mitch Kapor, Mother of all demos, move fast and break things, move fast and break things, mutually assured destruction, new economy, Norbert Wiener, old-boy network, pattern recognition, Paul Graham, Paul Terrell, paypal mafia, Peter Thiel, pets.com, pirate software, popular electronics, pre–internet, Ralph Nader, RAND corporation, Richard Florida, ride hailing / ride sharing, risk tolerance, Robert Metcalfe, Ronald Reagan, Sand Hill Road, Second Machine Age, self-driving car, shareholder value, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, skunkworks, Snapchat, social graph, software is eating the world, speech recognition, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supercomputer in your pocket, technoutopianism, Ted Nelson, the market place, the new new thing, There's no reason for any individual to have a computer in his home - Ken Olsen, Thomas L Friedman, Tim Cook: Apple, transcontinental railway, Uber and Lyft, uber lyft, Unsafe at Any Speed, upwardly mobile, Vannevar Bush, War on Poverty, We wanted flying cars, instead we got 140 characters, Whole Earth Catalog, WikiLeaks, William Shockley: the traitorous eight, Y Combinator, Y2K
In a replay of his audacious call to Bill Hewlett a decade earlier, Jobs dialed up the Intel switchboard, where someone connected him with the man who’d crafted that marketing campaign, Regis McKenna Himself. McKenna was unfazed by Apple’s garage setting and the co-founders’ scraggly looks. He’d worked with “lots of strange people” in the Valley already, and he was familiar with the Homebrew scene and the intriguing little enterprises bubbling up from it. The first meeting, however, was a bust. The Steves wanted help placing a Woz-authored article on the Apple II in Byte. It turned out that Steve Wozniak was much better at building elegant motherboards than crafting accessible prose; the piece was a rambling mess better suited for the hobbyist crowd back over at Dr. Dobb’s. McKenna told them it would have to be rewritten, and an offended Woz refused. Then I have nothing to offer you, replied McKenna.9 But Steve Jobs was not one to take no for an answer. Ever persistent, he called McKenna “about forty times” to persuade him to take Apple on as a client.
“Six people had already built their own computers, and almost everyone else wanted to.”3 The meeting attracted many of the usual suspects. Lee Felsenstein drove down from Berkeley. But it also drew in some new faces. Coming in from Cupertino was a former phone phreaker who’d spent his college years selling marginally legal “blue boxes” door to door in his dorm with a high school buddy. His name was Steve Wozniak, and his buddy’s name was Steve Jobs.4 Part swap meet, part intelligence gathering, part networking session, the biweekly Homebrew meetings quickly morphed into a local phenomenon. The second meeting moved from French’s garage to John McCarthy’s Stanford artificial-intelligence operation, then spilled out to the auditorium at the Stanford Linear Accelerator Center on Sand Hill Road, attracting hundreds of people each month.
Featuring shakily hand-drawn portraits of club members (beards, long hair, and Coke-bottle glasses predominated), first-name references to meeting participants, and rough-and-ready page layouts, the newsletters echoed the PCC in their chatty informality, even as the Homebrew Computer Club grew in size and influence. Liza Loop stood out in the crowd. She was the only woman on the early Homebrew membership roster, and she was a computer newbie. To encourage swapping and sharing, Moore’s newsletters included blurbs from members about their skills and needs. Steve Wozniak’s was typical, showcasing dizzying technical virtuosity: “have TVT[ypewriter] of my own design . . . have my own version of Pong,” he wrote. “Working on a 17 chip TV chess display (includes 3 stored boards); a 30 chip TV display. Skills: digital design, interfacing, I/O devices, short on time, have schematics.” In contrast, Loop wrote: “I am not primarily a computer person. So my greatest contribution is to help professionals communicate with total laymen and kids.
A People’s History of Computing in the United States by Joy Lisi Rankin
activist fund / activist shareholder / activist investor, Albert Einstein, Apple II, Bill Gates: Altair 8800, computer age, corporate social responsibility, Douglas Engelbart, Grace Hopper, Hacker Ethic, Howard Rheingold, Howard Zinn, Jeff Bezos, John Markoff, John von Neumann, Mark Zuckerberg, Menlo Park, Mother of all demos, Network effects, Norbert Wiener, pink-collar, profit motive, RAND corporation, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, the market place, urban planning, Whole Earth Catalog, wikimedia commons
Gates’s letter marked the start of a shift from people computing to people consuming. Increasingly, people would have to purchase computers and software (now, devices and apps) for their personal and social computing. BASIC also figures prominently in the history of Apple. Steve Wozniak produced his own “Integer BASIC” for his homemade computer, built around MOS Technology’s 6502 microprocessor chip; he shared Integer BASIC, and he even published programs in Dr. Dobb’s Journal.29 When Wozniak’s high school chum Steve Jobs saw the computer, he proposed they team up to assemble and sell them. They named the computer Apple, and soon began working on a new version, the Apple II. Although Apple declared its philosophy was “to provide software for our machines free or at minimal cost,” Apple sought (aggressively) to sell its hardware.30 Whether they were called home computers, hobby computers, microcomputers, or personal computers, they were consumer products, purveyed by Steve Jobs.
International Business Machines, much more familiar as IBM, dominated the era when computers were the remote and room-size machines of the military-industrial complex. Then, around 1975, along came the California hobbyists who created personal computers and liberated us from the monolithic mainframes. They were young men in the greater San Francisco Bay Area, and they tinkered in their garages. They started companies: Steve Jobs and Steve Wozniak established Apple; Bill Gates and Paul Allen developed Microsoft. Then, in the 1990s, along came the Internet to connect all of those personal computers, and the people using them. Another round of eccentric nerds (still all young white men)—Jeff Bezos, Sergey Brin, Larry Page, and Mark Zuckerberg among them—gave us Amazon, Google, Facebook, and the fiefdoms of Silicon Valley. Walter Isaacson’s The Innovators expands the popular narrative of digital history to include less familiar contributors such as the nineteenth-century mathematician Charles Babbage and the twentieth-century computing visionary J.
I contend that each of the computing communities described in previous chapters struggled with the transition from computing citizenship to computing consumption. PLATO’s revolutionary plasma screens attracted the investment of the Control Data Corporation, which tried (unsuccessfully) to market its own version of the PLATO system to schools and universities. The BASIC programs shared freely around the Dartmouth network and on the pages of the People’s Computer Company newsletter fueled the imaginations of many—including Steve Wozniak and Bill Gates. Gates first learned to program in BASIC, the language on which he built his Microsoft empire. Wozniak adapted Tiny BASIC into Integer BASIC to program his homemade computer, the computer that attracted the partnership of Steve Jobs and launched Apple. And the Minnesota software library, mostly BASIC programs including The Oregon Trail, proved to be the ideal complement for the hardware of Apple Computers.
The Hacker Crackdown by Bruce Sterling
Apple II, back-to-the-land, game design, ghettoisation, Haight Ashbury, Howard Rheingold, HyperCard, index card, informal economy, Jaron Lanier, Mitch Kapor, pirate software, plutocrats, Silicon Valley, Steve Wozniak, Steven Levy, Stewart Brand, The Hackers Conference, the scientific method, Whole Earth Catalog, Whole Earth Review
Before computers and their phone-line modems entered American homes in gigantic numbers, phone phreaks had their own special telecommunications hardware gadget, the famous "blue box." This fraud device (now rendered increasingly useless by the digital evolution of the phone system) could trick switching systems into granting free access to long-distance lines. It did this by mimicking the system's own signal, a tone of 2600 hertz. Steven Jobs and Steve Wozniak, the founders of Apple Computer, Inc., once dabbled in selling blue-boxes in college dorms in California. For many, in the early days of phreaking, blue-boxing was scarcely perceived as "theft," but rather as a fun (if sneaky) way to use excess phone capacity harmlessly. After all, the long-distance lines were JUST SITTING THERE.... Whom did it hurt, really? If you're not DAMAGING the system, and you're not USING UP ANY TANGIBLE RESOURCE, and if nobody FINDS OUT what you did, then what real harm have you done?
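The passage above explains the blue box’s trick: it spoofed the phone network’s in-band supervisory signal, a pure 2600 Hz tone. As an illustrative aside (not part of the source excerpt), a minimal Python sketch of synthesizing such a tone digitally, assuming an arbitrary 8 kHz sample rate:

```python
import math

def tone(freq_hz=2600.0, sample_rate=8000, seconds=1.0):
    """Synthesize a pure sine tone as a list of float samples in [-1.0, 1.0]."""
    n = int(sample_rate * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

# One second of the 2600 Hz supervisory tone that blue boxes imitated.
samples = tone()

# Sanity check on frequency: a 2600 Hz sine crosses zero twice per cycle,
# so we expect roughly 2 * 2600 = 5200 sign changes in one second.
crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0))
```

The function names and parameters here are hypothetical, chosen only to make the mechanism concrete; real blue boxes were analog oscillators, and the later multi-frequency dialing tones used pairs of frequencies rather than a single sine.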
On the contrary, like most rock musicians, the Grateful Dead have spent their entire adult lives in the company of complex electronic equipment. They have funds to burn on any sophisticated tool and toy that might happen to catch their fancy. And their fancy is quite extensive. The Deadhead community boasts any number of recording engineers, lighting experts, rock video mavens, electronic technicians of all descriptions. And the drift goes both ways. Steve Wozniak, Apple's co-founder, used to throw rock festivals. Silicon Valley rocks out. These are the 1990s, not the 1960s. Today, for a surprising number of people all over America, the supposed dividing line between Bohemian and technician simply no longer exists. People of this sort may have a set of windchimes and a dog with a knotted kerchief 'round its neck, but they're also quite likely to own a multimegabyte Macintosh running MIDI synthesizer software and trippy fractal simulations.
Furthermore, proclaimed the manifesto, the foundation would "fund, conduct, and support legal efforts to demonstrate that the Secret Service has exercised prior restraint on publications, limited free speech, conducted improper seizure of equipment and data, used undue force, and generally conducted itself in a fashion which is arbitrary, oppressive, and unconstitutional." "Crime and Puzzlement" was distributed far and wide through computer networking channels, and also printed in the Whole Earth Review. The sudden declaration of a coherent, politicized counter-strike from the ranks of hackerdom electrified the community. Steve Wozniak (perhaps a bit stung by the NuPrometheus scandal) swiftly offered to match any funds Kapor offered the Foundation. John Gilmore, one of the pioneers of Sun Microsystems, immediately offered his own extensive financial and personal support. Gilmore, an ardent libertarian, was to prove an eloquent advocate of electronic privacy issues, especially freedom from governmental and corporate computer-assisted surveillance of private citizens.
Thinking Machines: The Inside Story of Artificial Intelligence and Our Race to Build the Future by Luke Dormehl
Ada Lovelace, agricultural Revolution, AI winter, Albert Einstein, Alexey Pajitnov wrote Tetris, algorithmic trading, Amazon Mechanical Turk, Apple II, artificial general intelligence, Automated Insights, autonomous vehicles, book scanning, borderless world, call centre, cellular automata, Claude Shannon: information theory, cloud computing, computer vision, correlation does not imply causation, crowdsourcing, drone strike, Elon Musk, Flash crash, friendly AI, game design, global village, Google X / Alphabet X, hive mind, industrial robot, information retrieval, Internet of things, iterative process, Jaron Lanier, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kickstarter, Kodak vs Instagram, Law of Accelerating Returns, life extension, Loebner Prize, Marc Andreessen, Mark Zuckerberg, Menlo Park, natural language processing, Norbert Wiener, out of africa, PageRank, pattern recognition, Ray Kurzweil, recommendation engine, remote working, RFID, self-driving car, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, social intelligence, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, technological singularity, The Coming Technological Singularity, The Future of Employment, Tim Cook: Apple, too big to fail, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!
Virtually none of it was achieved using Good Old-Fashioned AI. The company’s name, of course, was Google. CHAPTER 2 Another Way to Build AI IT IS 2014 and, in the Google-owned London offices of an AI company called DeepMind, a computer whiles away the hours by playing an old Atari 2600 video game called Breakout. The game was designed in the early 1970s by two young men named Steve Jobs and Steve Wozniak, who later went on to start a company called Apple. Breakout is essentially a variation on the bat-and-ball tennis game Pong, except that instead of hitting the square ‘ball’ across the screen to another player, you fire it at a wall of bricks which smash on impact. The goal is to destroy all of the bricks. As we saw in the previous chapter, there is nothing at all unusual about AI playing games.
He insisted on giving it spoken responses – which the original Siri app had not had – and got rid of the ability to type requests as well as just ask them, so as not to complicate the experience of using it. Apple also removed the bad language, and gave Siri the ability to pull information from Apple’s native iOS apps. Early Siri reviews were very positive when the iPhone 4s launched in 2011. Over time, however, cracks began to show. Embarrassingly, Apple co-founder Steve Wozniak – who left Apple decades earlier – was one vocal critic of the service, noting how Apple’s own-brand version seemed less intelligent than the original third-party Siri app. What had won him over about the first Siri, he said, was its ability to correctly answer the questions, ‘What are the five largest lakes in California?’ and ‘What are the prime numbers greater than eighty-seven?’ Now, questions about California’s five largest lakes brought up links to lakefront properties.
Epub ISBN: 9780753551653 Version 1.0 3 5 7 9 10 8 6 4 2 WH Allen, an imprint of Ebury Publishing, 20 Vauxhall Bridge Road, London SW1V 2SA WH Allen is part of the Penguin Random House group of companies whose addresses can be found at global.penguinrandomhouse.com Copyright © Luke Dormehl 2016 Cover design: Two Associates Luke Dormehl has asserted his right to be identified as the author of this Work in accordance with the Copyright, Designs and Patents Act 1988 First published by WH Allen in 2016 www.eburypublishing.co.uk A CIP catalogue record for this book is available from the British Library ISBN 9780753556740 Chapter 1 fn1 The answer, in case you want to prove yourself as smart as an AI, is 162. Chapter 3 fn1 Coffee, as it turns out, is a good starting point for a discussion about smart devices. Apple’s co-founder Steve Wozniak once said that he could never foresee a robot with enough general intelligence to walk into a strange house and make a cup of coffee. Exploring this hypothesis, some researchers now suggest the ‘coffee test’ as a potential measure for AGI, Artificial General Intelligence. I will discuss AGI later on in this book. Chapter 4 fn1 To be fair to Mitsuku, very few of us would have a good answer if this question were put to us.
So Good They Can't Ignore You: Why Skills Trump Passion in the Quest for Work You Love by Cal Newport
Apple II, bounce rate, business cycle, Byte Shop, Cal Newport, capital controls, cleantech, Community Supported Agriculture, deliberate practice, financial independence, follow your passion, Frank Gehry, information asymmetry, job satisfaction, job-hopping, knowledge worker, Mason jar, medical residency, new economy, passive income, Paul Terrell, popular electronics, renewable energy credits, Results Only Work Environment, Richard Bolles, Richard Feynman, rolodex, Sand Hill Road, side project, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, web application, winner-take-all economy
It’s at this point that Jobs’s ascent begins to accelerate. He takes on $250,000 in funding from Mike Markkula and works with Steve Wozniak to produce a new computer design that is unambiguously too good to be ignored. There were other engineers in the Bay Area’s Homebrew Computer Club culture who could match Jobs’s and Wozniak’s technical skill, but Jobs had the insight to take on investment and to focus this technical energy toward producing a complete product. The result was the Apple II, a machine that leaped ahead of the competition: It had color graphics; the monitor and keyboard were integrated inside the case; the architecture was open, allowing rapid expansion of memory and peripherals (such as the floppy disk, which the Apple II was the first to introduce into mainstream use). This was the product that put the company on the map and that pushed Jobs from a small-time entrepreneur into the head of a visionary company.
During this period, Jobs split his time between Atari and the All-One Farm, a country commune located north of San Francisco. At one point, he left his job at Atari for several months to make a mendicant’s spiritual journey through India, and on returning home he began to train seriously at the nearby Los Altos Zen Center. In 1974, after Jobs’s return from India, a local engineer and entrepreneur named Alex Kamradt started a computer time-sharing company dubbed Call-in Computer. Kamradt approached Steve Wozniak to design a terminal device he could sell to clients to use for accessing his central computer. Unlike Jobs, Wozniak was a true electronics whiz who was obsessed with technology and had studied it formally at college. On the flip side, however, Wozniak couldn’t stomach business, so he allowed Jobs, a longtime friend, to handle the details of the arrangement. All was going well until the fall of 1975, when Jobs left for the season to spend time at the All-One commune.
Among others, I introduced Apple founder Steve Jobs, radio host Ira Glass, and master surfboard shaper Al Merrick. Using this trio as our running example, I can now ask what it is specifically about these three careers that makes them so compelling? Here are the answers that I came up with: TRAITS THAT DEFINE GREAT WORK Creativity: Ira Glass, for example, is pushing the boundaries of radio, and winning armfuls of awards in the process. Impact: From the Apple II to the iPhone, Steve Jobs has changed the way we live our lives in the digital age. Control: No one tells Al Merrick when to wake up or what to wear. He’s not expected in an office from nine to five. Instead, his Channel Island Surfboards factory is located a block from the Santa Barbara beach, where Merrick still regularly spends time surfing. (Jake Burton Carpenter, founder of Burton Snowboards, for example, recalls how negotiations for the merger between the two companies happened while he and Merrick waited for waves in a surf lineup.)
Intertwingled: The Work and Influence of Ted Nelson (History of Computing) by Douglas R. Dechow
3D printing, Apple II, Bill Duvall, Brewster Kahle, Buckminster Fuller, Claude Shannon: information theory, cognitive dissonance, computer age, conceptual framework, Douglas Engelbart, Dynabook, Edward Snowden, game design, HyperCard, hypertext link, information retrieval, Internet Archive, Jaron Lanier, knowledge worker, linked data, Marc Andreessen, Marshall McLuhan, Menlo Park, Mother of all demos, pre–internet, RAND corporation, semantic web, Silicon Valley, software studies, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, the medium is the message, Vannevar Bush, Wall-E, Whole Earth Catalog
Technical compromises made in the early days of the World Wide Web undermined Ted’s ability to implement hypertext on a large scale. He continues to rail at this constraint. Forty years after Computer Lib, computers are far more sophisticated and the networks among digital objects are much richer and more complex. It is time to revisit fundamental assumptions of networked computing, such as the directionality of links, a point made by multiple speakers at the symposium—Wendy Hall, Jaron Lanier, Steve Wozniak, and Rob Akscyn amongst them.1 Fig. 10.3 Ordinary hypertext, with multi-directional links. From Literary Machines (Used with permission) 10.2.3 Managing Research Data Managing research data is similarly a problem of defining and maintaining relationships amongst multi-media objects. Research data do not stand alone. They are complex objects that can be understood only in relation to their context, which often includes software, protocols, documentation, and other entities scattered over time and space.
Wardrip-Fruin N, Montfort N (eds) (2003) The new media reader. MIT Press, Cambridge, MA 17. Wing JM (2006) Computational thinking. Commun ACM 49(3) 18. Wozniak S (2014) In “Intertwingled: afternoon session #2.” Chapman University, Orange, California. Video timecode: 58:14. http://ibc.chapman.edu/Mediasite/Play/52694e57c4b546f0ba8814ec5d9223ae1d Footnotes 1For example, as Steve Wozniak said at Intertwingled, “At our computer club, the bible was Computer Lib” — referring to the Homebrew Computer Club, from which Apple Computer and other major elements of the turn to personal computers emerged. 2“Computational thinking is the process of recognising aspects of computation in the world that surrounds us, and applying tools and techniques from Computer Science to understand and reason about both natural and artificial systems and processes”. 3“Computational Media” has recently emerged as a name for the type of work that performs this interdisciplinary integration. 4Kodu is both an influential system itself and the basis of Microsoft’s Project Spark, launched in October 2014. 5The first stage of our work is described in “Say it With Systems”.
From the age of about three or four I was particularly fascinated by “exclusive or” light switches, where you have a room with the need for switches at two different doors and so they are wired up in such a way that both switches control the light and you can turn it on or off from either door. As a child I then went on to explore in sequence: electricity, electronics, digital electronics and early computers. We had ancient computers at my school. We had a PDP-8 and then an LSI-11 and an Apple II and so on up through the history of computers. I was interested in each level of hardware: how the physics of transistors worked, how digital circuits were put together, and how CPUs operated. When I was young, I designed a simple CPU and a simple operating system. I asked my brother to sit underneath a desk, fed him instructions, and had him execute them. In parallel with that interest, I have also always been interested in culture, both national cultures and popular culture.
The Four: How Amazon, Apple, Facebook, and Google Divided and Conquered the World by Scott Galloway
activist fund / activist shareholder / activist investor, additive manufacturing, Affordable Care Act / Obamacare, Airbnb, Amazon Web Services, Apple II, autonomous vehicles, barriers to entry, Ben Horowitz, Bernie Sanders, big-box store, Bob Noyce, Brewster Kahle, business intelligence, California gold rush, cloud computing, commoditize, cuban missile crisis, David Brooks, disintermediation, don't be evil, Donald Trump, Elon Musk, follow your passion, future of journalism, future of work, global supply chain, Google Earth, Google Glasses, Google X / Alphabet X, Internet Archive, invisible hand, Jeff Bezos, Jony Ive, Khan Academy, longitudinal study, Lyft, Mark Zuckerberg, meta-analysis, Network effects, new economy, obamacare, Oculus Rift, offshore financial centre, passive income, Peter Thiel, profit motive, race to the bottom, RAND corporation, ride hailing / ride sharing, risk tolerance, Robert Mercer, Robert Shiller, Search for Extraterrestrial Intelligence, self-driving car, sentiment analysis, shareholder value, Silicon Valley, Snapchat, software is eating the world, speech recognition, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Stewart Brand, supercomputer in your pocket, Tesla Model S, Tim Cook: Apple, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, undersea cable, Whole Earth Catalog, winner-take-all economy, working poor, young professional
As a luxury brand, Apple is the first technology company to have a shot at multigenerational success. Apple did not start as a luxury brand. It was the best house in a shitty neighborhood, tech hardware. A world of cables, geekware, acronyms, and low margins. In the early days, Apple simply made a more intuitive computer than its competitors. Steve Jobs’s notions about elegant packaging only appealed to a minority of customers; it was Steve Wozniak’s architecture that drew the rest. Back then, the company appealed largely to consumers’ brains. Many early Apple lovers were geeks (which did nothing for its sex appeal). Apple, to its credit, gazed across the tracks at luxury town and thought: Why not? Why can’t we be the best house in the best neighborhood? In the 1980s, the company declined. Machines running Microsoft Windows with Intel chips were faster and cheaper and began to win over the rational organ (the brain).
But who you really are has become what you text on. The Builder King You would be amazed at how many people still believe, against all evidence, that Steve Jobs actually invented all of Apple’s great products. As if he sat at a lab table in the R&D department at Apple headquarters in Cupertino and soldered chips on a tiny motherboard . . . until boom! he gave the world the iPod. Actually, that was Steve Wozniak with the Apple I a quarter century before. Steve Jobs was a genius—but his gifts lay elsewhere. And nowhere was that genius more visible than when business experts everywhere were proclaiming the “disintermediation” of tech—the disappearance of the physical distribution and retail channels as they were replaced by the virtualization of e-commerce. Jobs understood, as none of his peers did, that whereas content, even commodity products, might be sold online, if you wanted to sell electronics hardware as premium-priced luxury items, you had to sell them like other luxury items.
The Cupertino firm controls 14.5 percent of the smartphone market, but captures 79 percent of global smartphone profits (2016).11 Steve Jobs instinctively understood this. Attendees at the 1977 West Coast Computer Faire in San Francisco registered the difference the instant they walked into Brooks Hall: while all other new personal computer companies were offering stripped-out motherboards or ugly metal boxes, Jobs and Woz sat at their table behind the tan injected-plastic Apple II computers that would define the elegant Apple look. The Apple computers were beautiful; they were elegant. Most of all, in a world of hackers and gearheads, Apple’s products bespoke luxury. Luxury is not an externality; it’s in our genes. It combines our instinctive need to transcend the human condition and feel closer to divine perfection, with our desire to be more attractive to potential mates.
Game Over Press Start to Continue by David Sheff, Andy Eddy
affirmative action, air freight, Alexey Pajitnov wrote Tetris, Apple II, Apple's 1984 Super Bowl advert, Buckminster Fuller, game design, HyperCard, inventory management, James Watt: steam engine, Jaron Lanier, Marshall McLuhan, Mikhail Gorbachev, pattern recognition, profit motive, revision control, Ronald Reagan, Silicon Valley, Steve Jobs, Steve Wozniak
“I made it with my own two hands and a soldering iron,” Bushnell says. He named it “Pong,” after the sonar-like “pongs” that sounded each time the ball made contact with the paddle. In the fall of 1972, Bushnell placed “Pong,” the first commercial video-arcade game, with a coin box bolted to the outside, in Andy Capp’s tavern, a popular Sunnyvale pool bar that holds a place in Silicon Valley lore rivaled only by the garage in which Steve Jobs and Steve Wozniak invented the Apple computer. Set beside a pinball machine, “Pong” was an oddity, a dark wood cabinet that held a black-and-white TV screen on which cavorted a white blip like a shooting star in a black sky. One of the bar’s patrons stood over the machine, examining it. “Avoid missing ball for high score,” read the only line of instructions. The young man reached into his pocket, extracted a quarter, and slipped it into a slot on the console as he called a friend over.
The company literally couldn’t afford the payroll twice one month. Don Valentine’s money had helped build up production, but the returns lagged. A big success that followed “Pong” bailed them out. It was the first video car-racing game that was controlled by a steering wheel attached to the cabinet. The game, “Gran Trak,” gobbled up quarters even faster than “Pong.” A friend of Steve Jobs, Steve Wozniak, an engineer at Hewlett Packard, was a “Gran Trak” addict. Most evenings after work he headed to a pub, where he put great quantities of quarters, money he could not afford, into “Gran Trak.” Jobs began to sneak him into Atari’s production facility at night, where he could play the game for free. In exchange for the free-game time, Woz, a whiz with computers, helped out whenever Jobs hit a stumbling block with some particularly tricky circuitry.
As in the original game, each player controlled a tank that tried to seek out and destroy the other. When a successful hit was made, the on-screen tank exploded and the player controlling the disabled vehicle got an electric shock. “We did it so it wasn’t lethal or anything,” Bushnell notes. “But all of a sudden it was real. Certain people really liked it.” The company’s legal department, however, was not among them, and the game never made it out the door. Steve Wozniak came over to Atari to help Jobs build another “Pong”-based game for Bushnell called “Breakout.” A paddle hit a ball against a wall of bricks that disappeared, one by one, when hit, until there were none left. Bushnell liked the game, but the circuitry required too many expensive computer chips. He offered Jobs a bonus of $100 for every chip he was able to eliminate. Jobs made himself $5,000.
Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do by Jeremy Bailenson
Apple II, augmented reality, computer vision, deliberate practice, experimental subject, game design, Google Glasses, income inequality, Intergovernmental Panel on Climate Change (IPCC), iterative process, Jaron Lanier, low earth orbit, Mark Zuckerberg, Marshall McLuhan, meta-analysis, Milgram experiment, nuclear winter, Oculus Rift, randomized controlled trial, Silicon Valley, Skype, Snapchat, Steve Jobs, Steve Wozniak, Steven Pinker, telepresence, too big to fail
I was reminded of this while giving a talk at a technology conference in 2016 alongside Steve Wozniak, cofounder of Apple. Woz is high on VR—his first HTC Vive experience gave him goose bumps. But he cautions about overspecifying use cases. He told the story of the early days at Apple, and how when he and Steve Jobs made the Apple II, they conceived of it as a home appliance for computer enthusiasts, and believed users would use it to play games, or to store and access recipes in the kitchen. But it turned out it was good for unexpected applications. Sales really took off when a spreadsheet program was developed and suddenly people could do office work from home. According to Wozniak, he and Jobs were wrong about exactly what the Apple II would be used for. They knew they’d created something revolutionary, but they were mistaken in what that revolution meant.
An industry is already growing in Hollywood and Silicon Valley to explore VR as a space for fictional narratives, and within it storytellers from Hollywood and the gaming world, with technical help from technology companies, are beginning to take the tentative early steps in defining the grammar of virtual storytelling. Brett Leonard was a young filmmaker fresh from his hometown of Toledo, Ohio when he landed in Santa Cruz just before the beginning of the first VR boom in 1979. There he fell in with future Silicon Valley icons like Steve Wozniak, Steve Jobs, and Jaron Lanier. Jaron is also one of our most incisive and visionary thinkers about technology and its effects on human commerce and culture; at the time, as well as now, he was the very public face of virtual reality, a term he coined and popularized. Leonard was a big fan of technology and science fiction when his trip to Silicon Valley dropped him into what must have seemed the most interesting place in the world, amidst a group of people who were already playing a significant role in shaping the future.
At the time of the purchase, Oculus, founded by a 21-year-old self-taught engineer who had been mentored by the genius HMD maker Mark Bolas, had already reignited interest in VR among techies and gamers a few years earlier by making a lightweight and effective HMD prototype, the Oculus Rift, jury-rigged with smartphone screens and some clever programming. “I’ve seen a handful of technology demos in my life that made me feel like I was glimpsing into the future,” wrote Chris Dixon, an investor at the influential Silicon Valley venture capital firm Andreessen Horowitz. “Apple II, the Macintosh, Netscape, Google, the iPhone, and—most recently—the Oculus Rift.”1 While the performance of this new consumer VR equipment was not quite as good as that of the state-of-the-art hardware in labs like mine, it was good enough to avoid the major performance problem that had bedeviled previous attempts at consumer VR—nausea-inducing lag. And, perhaps more important to the goal of making VR a viable consumer medium, the Rift could be manufactured for around $300—considerably cheaper than the $30,000 state-of-the-art HMD used at labs like VHIL.
Dogfight: How Apple and Google Went to War and Started a Revolution by Fred Vogelstein
Apple II, Ben Horowitz, cloud computing, commoditize, disintermediation, don't be evil, Dynabook, Firefox, Google Chrome, Google Glasses, Googley, John Markoff, Jony Ive, Marc Andreessen, Mark Zuckerberg, Peter Thiel, pre–internet, Silicon Valley, Silicon Valley startup, Skype, software patent, spectrum auction, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Tim Cook: Apple, web application, zero-sum game
Head of Global Marketing Phil Schiller went to Chicago. Jony Ive and his design crew went to San Francisco. Steve Jobs’s store was, naturally, the one in downtown Palo Alto at the corner of University Avenue and Kipling Street. It was a mile and a half from his house and he often showed up there unannounced when he was in town. The appropriate high-tech luminaries had already gathered when he arrived. Apple cofounder Steve Wozniak and early Apple employees Bill Atkinson and Andy Hertzfeld were already standing on line. But it also seemed as if Jobs had some internal flames to fan of his own, said one of the engineers who was there along with Grignon and many others who had worked on the project, including Fadell and Forstall. “So there’s this reunion of the original Mac guys, and it’s really cool. And then Steve goes up to Tony [Fadell] and proceeds to go over in a corner of the store and talk to him for an hour and ignore Forstall just to fuck with him.”
But many others decided they now needed only two, and they started ditching their Microsoft-run Dell, HP, Toshiba, Acer, and Lenovo laptops at an accelerating clip. The shift hit Dell so hard that by the beginning of 2013 it was trying to take itself private to retrench. Jobs was particularly satisfied with this development, a confidant said—even though in the context of the other upheavals the iPad was unleashing it was almost a footnote. Thirty-five years after starting Apple with Steve Wozniak, Jobs was finally doing what he had set out to do all along: he was transforming what consumers and businesses expected from their computers. The Macintosh in 1984—the first mainstream machine to use a mouse—was supposed to have been the machine that did this. It was supposed to have taken a complicated device—the PC—and made it a consumer product that anyone could use. That failed. As everyone knows, Macs didn’t go away, but Microsoft Windows and Office get the credit for making the PC mainstream.
Fadell was getting ready to start his own company when Apple’s head of hardware, Jon Rubinstein, called, trying to recruit Fadell for a job that, astonishingly, he was not allowed to disclose. According to Steven Levy’s book The Perfect Thing, Fadell took the call on a ski slope in Colorado in January and expressed interest on the spot. He had idolized Apple since he was twelve, according to Levy. That was when he’d spent the summer of ’81 caddying to save up enough money to buy an Apple II. Weeks after Rubinstein’s call, Fadell joined Apple, only discovering then that he was being hired as a consultant to help build the first iPod. Grignon and others have said that Fadell’s rise never sat well with Forstall. Up until Fadell joined Apple, Jobs’s inner circle was composed of people he’d worked closely with at least from the beginning of his return in 1997, and in some cases from his days running NeXT, the computer company he’d founded after getting fired from Apple in 1985.
Jony Ive: The Genius Behind Apple's Greatest Products by Leander Kahney
Apple II, banking crisis, British Empire, Chuck Templeton: OpenTable:, Computer Numeric Control, Dynabook, global supply chain, interchangeable parts, John Markoff, Jony Ive, Kickstarter, race to the bottom, RFID, side project, Silicon Valley, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, the built environment, thinkpad, Tim Cook: Apple
It was more than four years since Brunner had written his conceptual brief. With the much-anticipated twentieth anniversary of the Macintosh approaching, the decision was made to designate Spartacus as a special edition. Officially named the Twentieth Anniversary Macintosh, the new product was limited to a run of just twenty thousand units. Apple unveiled it at Macworld in January 1997 and the first two units were given to Steve Jobs and Steve Wozniak, who had just returned to the company as advisers. To make it more memorable, the machine was hand-delivered to customers’ homes by specially trained “concierges,” who set up the machines, installed any expansion cards (along with the ugly hunchback) and showed users how to use them. “I think it is the first sensible computer design that we have seen in a long time,” said Henry Steiner, Hong Kong’s most eminent graphic designer.
“I think it is the first sensible computer design that we have seen in a long time,” said Henry Steiner, Hong Kong’s most eminent graphic designer. “It is quite beautiful and desirable. It has the status value of a Porsche. The fact that the machine combines computer, television and stereo system is impressive.” Like the MessagePad, the Twentieth Anniversary Macintosh (TAM) won not only kudos but awards, including the Best of Category prize for I.D. magazine’s Annual Design Review. Steve Wozniak thought it was the perfect college machine “with the computer, TV, radio, CD player and more (AV even) all in one sleek machine.” He had several at his mansion in the hills of Los Gatos above Silicon Valley. By the time the machine was pulled from the market one year after launch, however, Wozniak seemed to be the only person on the planet who liked it. The TAM bombed in the marketplace.
In addition, it shouldn’t take up too much space on a desk, so Jobs and his design team decided it should have an unusual vertical orientation, with the disk drive below the monitor, instead of to the side like other machines at the time. The design process continued for several months, with a sequence of prototypes and endless discussions. Material evaluations led to the tough ABS plastic used to make LEGO bricks, which would give the new machine a fine, scratch-resistant texture. Troubled by the way earlier Apple IIs had turned orange in sunlight over time, Manock decided to make the Macintosh beige, initiating a trend that would last twenty years. As Jony would do in the next generation at Apple, Jobs paid close attention to every detail. Even the mouse was designed to reflect the shape of the computer, with the same proportions, and a single square button that corresponded to the shape and placement of the screen.
All Your Base Are Belong to Us: How Fifty Years of Video Games Conquered Pop Culture by Harold Goldberg
activist lawyer, Alexey Pajitnov wrote Tetris, Apple II, cellular automata, Columbine, Conway's Game of Life, G4S, game design, In Cold Blood by Truman Capote, Mars Rover, Mikhail Gorbachev, Ralph Waldo Emerson, Ray Oldenburg, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, The Great Good Place, Thorstein Veblen, urban planning
Tobey was a boy genius, a brilliant but occasionally arrogant artistic phenomenon who was working on computer games while still in high school, toiling at babysitting jobs to pay for his $800 Commodore PET 2001. Tobey spent most of his time at the computer trying to make a game that was as close to real life as a computer in the 1980s could make it. Through word of mouth, Tobey’s flying and shooting game based on F-15 fighter jets came to the attention of Apple’s Steve Wozniak when Tobey was just sixteen. Wozniak was wowed by the sound, graphics, and game play. He kept saying, “This can’t be done on the Apple II. I can’t believe it. This can’t be done.” He gave Tobey a calling card and added a note to Trip Hawkins, which read, “Please consider this flight simulator as the finest Apple game ever done.” Hawkins didn’t waste any time. He wanted to make a deal right away. Tobey’s parents came with him to EA’s offices to oversee a lucrative royalty deal for Skyfox, a game that would eventually sell more than a million copies.
From plastic dust they were born and to plastic dust and desert sand they returned. In 1975 that plastic hadn’t been worthless at all. It was precious gold to the principals of Atari, and it would only become more valuable as the decade progressed. Atari’s arcade business was still thriving, Home Pong had exceeded sales expectations, and demand outstripped supply. Alcorn hired an unkempt and unshaven Steve Jobs, who in turn asked his best friend, the diffident genius Steve Wozniak, for help with what would be one of Atari’s most popular additions to its ever-expanding library. Without telling Alcorn, Bushnell asked Jobs to help him streamline the innards of a brick-breaking arcade game called Breakout. Bushnell wanted to save money because the chips used in each arcade machine were still pricey at the time. He coaxed the brazen, odoriferous Jobs with $750 and a $100 bonus for each chip removed from the prototype.
The graphics and play in the inaugural Madden effort that was finally released for the Apple II in June 1989 were like caveman drawings when measured by today’s standards. The art looked like a cheap cartoon. Only sixteen of the NFL’s twenty-eight teams were represented. While the real players were there, the teams’ logos weren’t. And while the stats for each player were carefully honed for realism’s sake, every player looked the same. On the cover of the first game, the smiling Madden, holding a football running back–style, looks as much surprised as he is happy. It’s as if he’s about to say, “Gee, I know football, but what’s this videogame thing all about?” Nonetheless, the gaming world went wild over the game. Nibble, an Apple II enthusiast magazine of the time, detailed the many functions of the game and highlighted the news that you could make your own plays “if you’re really serious about football.”
The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley by Leslie Berlin
Apple II, Bob Noyce, business cycle, collective bargaining, computer age, George Gilder, informal economy, John Markoff, Kickstarter, laissez-faire capitalism, low skilled workers, means of production, Menlo Park, Murray Gell-Mann, open economy, Richard Feynman, Ronald Reagan, Sand Hill Road, Silicon Valley, Silicon Valley startup, Steve Jobs, Steve Wozniak, union organizing, War on Poverty, women in the workforce, Yom Kippur War
This certainly was true of Apple Computer, which was financed by men associated with Fairchild and Intel and staffed with many people from Hewlett-Packard and Intel.46 Apple had gotten its start in 1976, when 19-year-old Jobs convinced his friend Steve Wozniak, who had developed a personal computer in his garage, to start a business with him. The two showed their computer to venture capitalist Don Valentine (a former Fairchild salesman), who suggested they contact Mike Markkula, recently retired (at age 34) from his job in Intel’s marketing group. Markkula, who had long dreamed of something like a personal computer—as a teenager, he had built a “programmable electronic sliderule”—invested $91,000 in the company. In exchange, he received a one-third ownership stake in Apple.47 One of Markkula’s first calls on behalf of Apple was to Noyce. “I want you to be aware of this,” Markkula said. “I’d like to present to the [Intel] board.” Noyce gave his approval and on the appointed day, Markkula and Steve Wozniak gave a presentation about the personal computer, an Apple II on hand for demonstration purposes.
But he was interesting enough to talk to, and soon Bowers found herself engrossed in what she called “all Steve’s schemes,” only half of which she thought were even remotely feasible. Clearly this was a company that needed her help. She agreed to consult for Apple.49 A few months into her consulting work, Bowers learned that Steve Wozniak wanted to sell some of his founders’ stock for $13 a share. She bought it from him. “Bob thought I was nuts,” she recalls. Noyce did not try to stop her from investing—they had long ago agreed that she could do what she liked with her money, and he could do the same with his—but he could not take Jobs and Wozniak seriously. Even Arthur Rock admits, “Steve Jobs and Steve Wozniak weren’t very appealing people in those days.” Wozniak was the telephone-era’s version of a hacker—he used a small box that emitted electronic tones to call around the world for free—and Steve Jobs’s ungroomed appearance was offputting to Noyce.
Noyce gave his approval and on the appointed day, Markkula and Steve Wozniak gave a presentation about the personal computer, an Apple II on hand for demonstration purposes. “If you want to participate in this in some way, say so,” Markkula told the board. “If you don’t, fine. But this is something you should have in front of your consciousness.” Intel had not given much thought to the personal computer since Moore squelched Noyce and Gelbach’s plans to go head to head with Altair. The board listened politely and asked a few questions, but no one proposed a relationship between Intel and Apple that went beyond Intel’s possibly providing the microprocessor for Apple Computers. “Nothing else was really in Intel’s best interest,” Markkula acknowledges.48 But Arthur Rock had paid careful attention to Markkula and Wozniak’s presentation. A few days later, he called Markkula’s office.
Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen
Apple II, Brian Krebs, Burning Man, corporate governance, dumpster diving, Exxon Valdez, Hacker Ethic, hive mind, index card, Kickstarter, McMansion, Mercator projection, offshore financial centre, packet switching, pirate software, Ponzi scheme, Robert Hanssen: Double agent, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, traffic fines, web application, WikiLeaks, zero day, Zipcar
But hacking was above all a creative effort, one that would lead to countless watershed moments in computer history. The word “hacker” took on darker connotations in the early 1980s, when the first home computers—the Commodore 64s, the TRS-80s, the Apples—came to teenagers’ bedrooms in suburbs and cities around the United States. The machines themselves were a product of hacker culture; the Apple II, and with it the entire home computer concept, was born of two Berkeley phone phreaks named Steve Wozniak and Steve Jobs. But not all teenagers were content with the machines, and in the impatience of youth, they weren’t inclined to wait for grad school to dip into real processing power or to explore the global networks that could be reached with a phone call and the squeal of a modem. So they began illicit forays into corporate, government, and academic systems and took their first tentative steps into the ARPANET, the Internet’s forerunner.
See http://www.securityfocus.com/comments/articles/203/5729/threaded (May 24, 2001). Max says he did not consider himself an informant and only provided technical information. Chapter 4: The White Hat 1 The first people to identify themselves as hackers: The seminal work on the early hackers is Steven Levy, Hackers: Heroes of the Computer Revolution (New York: Anchor Press/Doubleday, 1984). Also see Steve Wozniak and Gina Smith, iWoz: From Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It (New York: W. W. Norton and Company, 2006). 2 Tim was at work one day: This anecdote was recalled by Tim Spencer. Max later recalled Spencer’s advice in a letter to his sentencing judge in Pittsburgh. 3 If there was one thing Max: Details of Max’s relationship with Kimi come primarily from interviews with Kimi. 4 Max went up to the city to visit Matt Harrigan: Harrigan’s business and his work with Max were described primarily by Harrigan, with some details confirmed by Max.
Lab Rats: How Silicon Valley Made Work Miserable for the Rest of Us by Dan Lyons
Airbnb, Amazon Web Services, Apple II, augmented reality, autonomous vehicles, basic income, bitcoin, blockchain, business process, call centre, Clayton Christensen, clean water, collective bargaining, corporate governance, corporate social responsibility, creative destruction, cryptocurrency, David Heinemeier Hansson, Donald Trump, Elon Musk, Ethereum, ethereum blockchain, full employment, future of work, gig economy, Gordon Gekko, greed is good, hiring and firing, housing crisis, income inequality, informal economy, Jeff Bezos, job automation, job satisfaction, job-hopping, John Gruber, Joseph Schumpeter, Kevin Kelly, knowledge worker, Lean Startup, loose coupling, Lyft, Marc Andreessen, Mark Zuckerberg, McMansion, Menlo Park, Milgram experiment, minimum viable product, Mitch Kapor, move fast and break things, move fast and break things, new economy, Panopticon Jeremy Bentham, Paul Graham, paypal mafia, Peter Thiel, plutocrats, Plutocrats, precariat, RAND corporation, remote working, RFID, ride hailing / ride sharing, Ronald Reagan, Rubik’s Cube, Ruby on Rails, Sam Altman, Sand Hill Road, self-driving car, shareholder value, Silicon Valley, Silicon Valley startup, six sigma, Skype, Social Responsibility of Business Is to Increase Its Profits, software is eating the world, Stanford prison experiment, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, TaskRabbit, telemarketer, Tesla Model S, Thomas Davenport, Tony Hsieh, Toyota Production System, traveling salesman, Travis Kalanick, tulip mania, Uber and Lyft, Uber for X, uber lyft, universal basic income, web application, Whole Earth Catalog, Y Combinator, young professional
All employees got bonuses and participated in profit sharing. Bosses engaged in “management by wandering around,” a practice that struck Tom Peters when he studied HP and raved about the company’s way of doing things in his 1982 business classic, In Search of Excellence. By the 1970s, HP was a thriving organization that many in Silicon Valley (and beyond) wanted to emulate. Apple co-founder Steve Wozniak, who worked as an engineer at HP in the 1970s, later recalled: “We had such great camaraderie. We were so happy. Almost everyone spoke about it as the greatest company you could ever work for.” The 1970s brought another element to Silicon Valley—the idealistic values of the counterculture. “Power to the people” was the slogan of the 1960s, and it was also the motto of the people who led the personal computer revolution in the 1970s.
Amazon Prime is an amazing service, but Amazon abuses workers in its headquarters and warehouses. Customers love Uber, but Uber operates a toxic workplace and exploits its drivers. Tesla makes very sexy electric cars, but by many accounts, Elon Musk behaves abominably toward his employees and has earned a reputation for being less than forthcoming with customers. “I don’t believe anything Elon Musk or Tesla says,” Apple co-founder Steve Wozniak, a disappointed Tesla owner, said in 2018. In the past few years I’ve come to the uncomfortable conclusion that, for various reasons mostly related to greed, the very people in Silicon Valley who talk so much about making the world a better place are actually making it worse—at least when it comes to the well-being of workers. It would be nice to believe that this is happening because these tech geniuses are a little bit Aspergery and lack the social skills needed to manage people effectively—that is, that they’re well-meaning nerds who are clueless about their fellow human beings.
Breaking the Code of Silence The Kapors have been activists, in one way or another, since the early 1970s, when Mitch was at Yale and Freada was at UC-Berkeley. They might have become just another kooky old hippie couple living in the Bay Area except that in the early 1980s Mitch became fantastically rich. This happened almost by accident. After graduation in 1971 he spent a decade bouncing around. He taught Transcendental Meditation. He worked as a DJ. In 1978 he bought an Apple II computer and taught himself to write programs, which landed him a job at VisiCorp, a tiny software developer near Boston. In 1982, Mitch founded Lotus Development, named after the lotus position used in meditation, to sell a software program called Lotus 1-2-3, a spreadsheet that ran on the recently introduced IBM personal computer. Kapor expected Lotus would generate $1 million in sales in its first year.
Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries by Safi Bahcall
accounting loophole / creative accounting, Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, Astronomia nova, British Empire, Cass Sunstein, Charles Lindbergh, Clayton Christensen, cognitive bias, creative destruction, disruptive innovation, diversified portfolio, double helix, Douglas Engelbart, Douglas Engelbart, Edmond Halley, Gary Taubes, hypertext link, invisible hand, Isaac Newton, Johannes Kepler, Jony Ive, knowledge economy, lone genius, Louis Pasteur, Mark Zuckerberg, Menlo Park, Mother of all demos, Murray Gell-Mann, PageRank, Peter Thiel, Philip Mirowski, Pierre-Simon Laplace, prediction markets, pre–internet, Ralph Waldo Emerson, RAND corporation, random walk, Richard Feynman, Richard Thaler, side project, Silicon Valley, six sigma, Solar eclipse in 1919, stem cell, Steve Jobs, Steve Wozniak, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Tim Cook: Apple, tulip mania, Wall-E, wikimedia commons, yield management
The less-famous history of an ultra-famous icon captures one person’s evolution toward this balance. During Steve Jobs’s first stint at Apple, he called his loonshot group working on the Mac “pirates” or “artists” (he saw himself, of course, as the ultimate pirate-artist). Jobs dismissed the group working on the Apple II franchise as “regular Navy.” The hostility he created between the two groups, by lionizing the artists and belittling the soldiers, was so great that the street between their two buildings was known as the DMZ—the demilitarized zone. The hostility undermined both products. Steve Wozniak, Apple’s cofounder, who was working on the Apple II franchise, left along with other critical employees; the Mac launch failed commercially; Apple faced severe financial pressure; Jobs was exiled; and John Sculley took over (eventually rescuing the Mac and restoring financial stability).
By early 1993, nearly all the vice presidents at the company, including all five of Jobs’s original cofounders, had left. A Forbes article stated, “There are very few miracle workers in the business world, and it is now clear that Steve Jobs is not one of them.” WHEN MOSES DOUBLES DOWN The facts of Jobs’s forced exit from Apple in 1985, and his path to the mess at NeXT, have been well laid out. In 1975, Steve Wozniak combined a microprocessor, keyboard, and screen into one of the earliest personal computers. Jobs convinced Wozniak to quit his job and start a company. After some initial success with their Apple I and II, however, competitors quickly passed Apple by. In 1980, Atari and Radio Shack (TRS-80) sold roughly seven times as many computers as Apple. By 1983, Commodore dominated the market, with the IBM PC, launched only two years earlier, a close second.
Loving your loonshot and franchise groups equally, however, requires overcoming natural preferences. Artists tend to favor artists. Soldiers tend to favor soldiers. Jobs proudly and publicly referred to his team, working on the Macintosh, as artists. He referred to the rest of the company, developing the Apple II franchise, as bozos. Apple II engineers took to wearing buttons with a circle and line running through an image of Bozo the Clown. Wozniak, an engineer with the demeanor of a teddy bear, was widely beloved at the company and in the industry. He resigned, openly complaining about the demoralizing attacks. Departures in the Apple II group became so common that one joke ran, “If your boss calls, be sure to get his name.” The toxicity spread. Key designers on the Macintosh side soon began leaving as well. It didn’t take long for the Apple Board of Directors and its recently hired CEO, John Sculley, to conclude the dysfunction was not sustainable.
The Simulation Hypothesis by Rizwan Virk
3D printing, Albert Einstein, Apple II, artificial general intelligence, augmented reality, Benoit Mandelbrot, bioinformatics, butterfly effect, discovery of DNA, Dmitri Mendeleev, Elon Musk, en.wikipedia.org, Ernest Rutherford, game design, Google Glasses, Isaac Newton, John von Neumann, Kickstarter, mandelbrot fractal, Marc Andreessen, Minecraft, natural language processing, Pierre-Simon Laplace, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, Silicon Valley, Stephen Hawking, Steve Jobs, Steve Wozniak, technological singularity, Turing test, Vernor Vinge, Zeno's paradox
Nevertheless, the video game pioneers of that time persisted, squeezing every bit of performance out of the limited hardware and memory of the day to create these early arcade games. A well-known anecdote from Silicon Valley at the time involves future Apple Computer co-founders Steve Jobs and Steve Wozniak. Jobs worked for Nolan Bushnell, the founder of Atari, and promised his boss that he could build a certain game quickly and using limited memory resources. Bushnell was skeptical but gave him the project. Jobs then brought in his friend Steve Wozniak, who created the game at night after finishing his full-time engineering job. Wozniak, the future creator of the first Apple computers, is of course acknowledged today as a hardware genius. In some ways, the history of video games is the history of optimizing very limited resources.
Beyond the bleachers, there was a sky with clouds and a city or country landscape that was only partly visible. I found myself wondering how far this “simulated world” extended in all directions beyond the track. What happened when no one was playing the video game? Did the characters and the buildings still exist, or did they simply cease to exist? Although I learned to program rudimentary video games myself shortly thereafter when my parents bought my brother and me a Commodore 64 (and later, an Apple II), it would be many years before I understood video game development well enough to answer these types of questions. The first game I ever created was Tic Tac Toe, basically putting blocky lines on the screen and then figuring out how to get the computer to “draw” Xs and Os on the squares selected by the players. My brother and I would play each other, but after he got bored I figured I could play against the computer.
Alpha Girls: The Women Upstarts Who Took on Silicon Valley's Male Culture and Made the Deals of a Lifetime by Julian Guthrie
Airbnb, Apple II, barriers to entry, blockchain, Bob Noyce, call centre, cloud computing, credit crunch, disruptive innovation, Elon Musk, equal pay for equal work, fear of failure, game design, glass ceiling, hiring and firing, Jeff Bezos, Louis Pasteur, Lyft, Mark Zuckerberg, Menlo Park, Mitch Kapor, new economy, PageRank, peer-to-peer, pets.com, phenotype, place-making, Ronald Reagan, Rosa Parks, Sand Hill Road, Silicon Valley, Silicon Valley startup, Skype, Snapchat, software as a service, South of Market, San Francisco, stealth mode startup, Steve Jobs, Steve Wozniak, TaskRabbit, Tim Cook: Apple, Travis Kalanick, uber lyft, unpaid internship, upwardly mobile, urban decay, web application, William Shockley: the traitorous eight, women in the workforce
It was ruled by men: Samuel Brannan, Levi Strauss, John Studebaker, Henry Wells, and William Fargo. Women, outnumbered and overmatched, were mostly reduced to entertainers, companions, wives, or housekeepers. Things were not that different in the more recent gold rush. The Valley was always a region dominated by men, from William Hewlett, Dave Packard, Bob Noyce, Gordon Moore, Andy Grove, Larry Ellison, Steve Jobs, and Steve Wozniak to, decades later, in the twenty-first century, Larry Page, Sergey Brin, Mark Zuckerberg, Elon Musk, Tim Cook, Travis Kalanick, and Marc Benioff. Mary Jane, fueled by peanut butter sandwiches packed in wax paper for the two-day journey, was under no illusion that it would be easy to navigate the old boys’ club of Sand Hill Road and Silicon Valley. Even today, decades after Mary Jane first arrived, 94 percent of investing partners at venture capital firms—the financial decision makers shaping the future—are men, and more than 80 percent of venture firms have never had a woman investing partner.
She believed with every fiber of her being that e-commerce was the wave of the future. * * * Her career after graduating from Stanford, more than a decade before, had started off so promising, and then she took a detour. Even before she finished her master’s degree, Magdalena had gone to seven job interviews and received seven offers. Her first interview had been with Steve Jobs and Steve Wozniak of Apple. The founders invited double-e students to the LOTS computer center to hear a pitch about their three-year-old company. If the students liked what they heard, they could stay and be interviewed. Jobs, wearing wire-rimmed glasses and jeans, told Magdalena and the other double-e students that working for Apple would be like “an extension of college.” Magdalena was one of sixteen students who showed up for the interview.
If venture didn’t diversify, tech couldn’t diversify. She believed that women were more likely to fund other women and that more women were needed at the decision-making table. Once she defined the focus on venture, Melinda wanted to understand what was keeping women out of the industry and what were the pathways in. Melinda had been drawn to tech personally because of an outstanding woman math teacher who was able to get ten Apple II computers into her Catholic girls’ school. She remembered the teacher asking the girls whether they wanted to learn to code in BASIC. Melinda took to coding immediately, finding it like solving a puzzle, something she’d always loved. She then got a summer job teaching kids how to program in LOGO. She attended Duke because the university had a grant from IBM for two big computer labs. She had worked at Microsoft for nine years and loved her job.
Brotopia: Breaking Up the Boys' Club of Silicon Valley by Emily Chang
23andMe, 4chan, Ada Lovelace, affirmative action, Airbnb, Apple II, augmented reality, autonomous vehicles, barriers to entry, Bernie Sanders, Burning Man, California gold rush, Chuck Templeton: OpenTable:, David Brooks, Donald Trump, Elon Musk, equal pay for equal work, Ferguson, Missouri, game design, gender pay gap, Google Glasses, Google X / Alphabet X, Grace Hopper, high net worth, Hyperloop, Jeff Bezos, job satisfaction, Khan Academy, Lyft, Marc Andreessen, Mark Zuckerberg, Maui Hawaii, Menlo Park, meta analysis, meta-analysis, microservices, paypal mafia, Peter Thiel, post-work, pull request, ride hailing / ride sharing, rolodex, Saturday Night Live, shareholder value, side project, Silicon Valley, Silicon Valley startup, Skype, Snapchat, Steve Jobs, Steve Wozniak, Steven Levy, subscription business, Tim Cook: Apple, Travis Kalanick, uber lyft, women in the workforce
As the number of overall computer science degrees picked back up leading into the dot-com boom, more men than women were filling those coveted seats. In fact, the percentage of women in the field would dramatically decline for the next two and a half decades. APPLE UPSETS THE NERD CART As women were leaving the tech world, a new type of tech hero was taking center stage. In 1976, Apple was co-founded by Steve Wozniak, your typical nerd, and Steve Jobs, who was not your typical nerd at all. Jobs exuded a style and confidence heretofore unseen in the computer industry. He had few technical skills—Wozniak handled all that—yet Jobs was a never-before-seen kind of tech rock star. He proved you could rise on the strength of other skills, such as conviction, product vision, marketing genius, and a willingness to take risks.
The cherry on top: when the troll responded inappropriately to a tweet in which I had tagged IBM CEO Ginni Rometty, after an interview I had conducted with her, Rometty herself was alerted with several cheerful notifications from Twitter. I’ve developed the requisite thick skin, and I use a common tactic for dealing with trolls: ignoring them. I quickly scroll past the vitriolic direct replies to my Twitter account, and I never, ever use Reddit. Once an interview I conducted with Apple’s co-founder Steve Wozniak ended up on Reddit, and the response was worse than unnerving. (For the same reason, many women in tech avoid using Hacker News, the prominent start-up incubator Y Combinator’s official bulletin board that has since become one of the industry’s leading message boards; the trolls are there too.) Most important, I don’t respond to the haters. This is accepted wisdom among many female users: the worst way to deal with a troll is to poke it.
Cheryan referenced a quotation from research performed by Margolis from a young female computer science student expressing her perceived distance from tech capability more simply. “Oh, my gosh, this isn’t for me,” she said. “I don’t dream in code like they do.” WOMEN’S NARROW PATH GETS NARROWER Shy, antisocial boys in their coding caves weren’t glamorous, but starting in the late 1970s and early 1980s, the computer business suddenly was. It began when Apple released the Apple II and continued when, a couple of years later, IBM came out with the PC. In 1984, Apple brought the groundbreaking Macintosh to market, and in 1985 Microsoft released Windows 1.0. Thanks to these new machines and the realization that there were fortunes to be made, the field was suddenly heady with excitement. As computers gained new status and exploded in popularity, hacker conferences and computer clubs sprang up across the San Francisco Bay Area, and enrollment in computer science classes surged at universities across the country.
Why Wall Street Matters by William D. Cohan
Apple II, asset-backed security, bank run, Bernie Sanders, Blythe Masters, bonus culture, break the buck, buttonwood tree, corporate governance, corporate raider, creative destruction, Credit Default Swap, Donald Trump, Exxon Valdez, financial innovation, financial repression, Fractional reserve banking, Gordon Gekko, greed is good, income inequality, Joseph Schumpeter, London Interbank Offered Rate, margin call, money market fund, moral hazard, Potemkin village, quantitative easing, secular stagnation, Snapchat, South Sea Bubble, Steve Jobs, Steve Wozniak, too big to fail, WikiLeaks
Objectively speaking, we learn from the Apple prospectus that there would be no Apple, at least in its present form, without Wall Street. The prospectus explains that Apple had a relatively large group of early investors who supported the company from its inception in 1976, when Steve Jobs and Steve Wozniak, the two founders, “designed, developed and assembled the Apple I, a microprocessor-based computer consisting of a single printed circuit board.” On January 3, 1977, Apple incorporated; three months later, it introduced the Apple II, which was similar to the Apple I but with a keyboard and a plastic cover. For the nine months leading up to the end of September 1977, Apple had a profit of almost $45,000. But Apple had big ambitions, as the prospectus makes clear, and achieving those ambitions required capital.
Track Changes by Matthew G. Kirschenbaum
active measures, Apple II, Apple's 1984 Super Bowl advert, Bill Gates: Altair 8800, Buckminster Fuller, commoditize, computer age, corporate governance, David Brooks, dematerialisation, Donald Knuth, Douglas Hofstadter, Dynabook, East Village, en.wikipedia.org, feminist movement, forensic accounting, future of work, Google Earth, Gödel, Escher, Bach, Haight Ashbury, HyperCard, Jason Scott: textfiles.com, Joan Didion, John Markoff, John von Neumann, Kickstarter, low earth orbit, mail merge, Marshall McLuhan, Mother of all demos, New Journalism, Norman Mailer, pattern recognition, pink-collar, popular electronics, RAND corporation, rolodex, Ronald Reagan, self-driving car, Shoshana Zuboff, Silicon Valley, social web, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Ted Nelson, text mining, thinkpad, Turing complete, Vannevar Bush, Whole Earth Catalog, Y2K, Year of Magical Thinking
Nonetheless, at the very height of this period she found time to help start a Kaypro users’ group called Bad Sector, the same name she had unceremoniously given to her first computer. Users’ groups were one of the fixtures of early computer culture. The first and most famous of them all was the Homebrew Computer Club, which met in the auditorium of Stanford University’s linear accelerator. Adam Osborne was a member, and at one of those now quasi-legendary gatherings Steve Wozniak had demonstrated his prototype for what eventually became the Apple II computer. But users’ groups were all very “homebrew”; they tended to coalesce organically, their members finding each other through notices tacked up in computer shops (or on virtual bulletin boards), ads in newsletters, and word of mouth. Typically they were tied together by an interest in a common system or product. Members would swap tips and help one another troubleshoot.
(Lexitron’s was the first in 1971; Vydec’s in 1974 was the first to display a full page of text on the screen.) Beeching’s fanciful scenario would thus have already been familiar—mundane—to any office secretary who had been trained on Lexitron or its competitors’ equipment.2 The creation of a working “TV Typewriter” (TVT) also soon became a rite of passage for the home computer hobbyist. It was a key stepping stone for Steve Wozniak on the way to the Apple computer, and it featured prominently in the announcement for the first meeting of the Homebrew Computer Club: “Are you building your own computer? Terminal? T V Typewriter? I O [input/output] device? Or some other digital black-magic box?” read the ad that was posted around Silicon Valley in February 1975.3 Computers themselves, of course, compute: which is to say they work by fundamentally arithmetical principles.
Smith, and developed since 1990 by Eastgate Systems), and then the Web itself.67 Interactive fiction—so-called “text adventures,” which had a brief commercial vogue in the early 1980s through the success of a company named Infocom—have also increasingly been explored by scholars.68 Lori Emerson, among the best of recent academic critics, has carefully detailed the ways in which poets like bpNichol, Geof Huth, and Paul Zelevansky each leveraged the programmable capabilities of the early Apple II line of computers to craft innovative on-screen textual compositions.69 The texts thus produced are indeed striking, harbingers of a new aesthetic intimately tied to the procedural capabilities of digital media but also knowingly reaching back to Concrete Poetry, Surrealism, and other well-documented movements. Yet the Osborne 1, which debuted in 1981 (the year the Apple II became the best-selling computer on the consumer market), was likewise an important platform for writing, as we have already seen. One motivation in my labeling of word processing as a literary subject is to balance the preponderance of critical and historical attention already devoted to those relatively few writers who, as Emerson relates (after John Cage), viewed the computer as a labor-making device—allowing for bold but sometimes rarefied experiments—rather than as the far more commonplace labor-saving device it was for most users.
What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry by John Markoff
Any sufficiently advanced technology is indistinguishable from magic, Apple II, back-to-the-land, beat the dealer, Bill Duvall, Bill Gates: Altair 8800, Buckminster Fuller, California gold rush, card file, computer age, computer vision, conceptual framework, cuban missile crisis, different worldview, Donald Knuth, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Thorp, El Camino Real, Electric Kool-Aid Acid Test, general-purpose programming language, Golden Gate Park, Hacker Ethic, hypertext link, informal economy, information retrieval, invention of the printing press, Jeff Rulifson, John Markoff, John Nash: game theory, John von Neumann, Kevin Kelly, knowledge worker, Mahatma Gandhi, Menlo Park, Mother of all demos, Norbert Wiener, packet switching, Paul Terrell, popular electronics, QWERTY keyboard, RAND corporation, RFC: Request For Comment, Richard Stallman, Robert X Cringely, Sand Hill Road, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, speech recognition, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, The Hackers Conference, Thorstein Veblen, Turing test, union organizing, Vannevar Bush, Whole Earth Catalog, William Shockley: the traitorous eight
Terminal, TV typewriter? I/O Device? or some other digital black-magic box? Or are you buying time on a time-sharing service? If so you might like to come to a gathering of people with like-minded interests. Exchange information, swap ideas, talk shop, help work on a project, whatever…13 One person who saw the flyer was Allen Baum, who was working at Hewlett-Packard at the time with his friend Steve Wozniak. The two had met in high school when Baum had seen Wozniak sitting in his homeroom class drawing strange graphics in a notebook. “What are you doing?” Baum asked. “I’m designing a computer,” was Wozniak’s reply. It turned out that Baum had on his own become intrigued with computers just months earlier after his father, who had moved the family from the East Coast, took a job at Stanford Research Institute.
This interview is the clearest and most comprehensive account of Engelbart’s career, and I have relied on it extensively. 2. Ibid. 3. Ibid. 4. There is some confusion on this point. At various times Engelbart has said that he found the original article in the library and at other times he has said he believed he first read the Life account of Vannevar Bush’s Memex. Whatever the case, it had a defining impact on him. 5. Vannevar Bush, “As We May Think,” Atlantic Monthly, July 1945. 6. Lowood and Adams, oral history. 7. Ibid. Twenty years later, a young Steve Wozniak, then a brand-new HP engineer, would ask the company if they wanted to sell a personal computer. HP said it wasn’t interested, and Wozniak went off to cofound Apple Computer. It was the second time the Silicon Valley pioneer missed an opportunity to define the future of computing. 8. Ibid. 9. Jack Goldberg, Stanford Research Institute, e-mail to author. 10. Author interview, Charles Rosen, Menlo Park, Calif., October 10, 2001. 11. Douglas C.
A frustrated Engelbart began to explore the idea of remotely connecting to the SDC computer from the Control Data minicomputer in Menlo Park using an early modem. Unfortunately his engineers were never able to make the system communicate reliably. As a result, for the next two years Engelbart’s fledgling Augmented Human Intellect Research Center began to build his system on a computer that had far less processing power than an Apple II of a decade and a half later. The Menlo Park computer used the magnetic-core memory that Engelbart, Crane, and English had all worked on improving in the fifties. It had a capacity of eight thousand twelve-bit characters—a little more than three pages of typed text—in its main memory. Instead of on a disk drive, it stored information permanently on a rotating drum that could hold thirty-two thousand characters.
Dealers of Lightning by Michael A. Hiltzik
Apple II, Apple's 1984 Super Bowl advert, beat the dealer, Bill Duvall, Bill Gates: Altair 8800, business cycle, computer age, creative destruction, Douglas Engelbart, Dynabook, Edward Thorp, El Camino Real, index card, Jeff Rulifson, John Markoff, Joseph Schumpeter, Marshall McLuhan, Menlo Park, oil shock, popular electronics, Robert Metcalfe, Ronald Reagan, Silicon Valley, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, the medium is the message, Vannevar Bush, Whole Earth Catalog, zero-sum game
The idea was not wholly implausible. Apple was coming on strong. Started in the proverbial Silicon Valley garage by Jobs and his high school classmate Steve Wozniak, Apple had successfully negotiated the transition in its product line from kit versions of Woz’s little personal computer to a more versatile version, the Apple II. This machine was unique in the hobbyist market. It came already assembled, with a keyboard (although it required a separate monitor). Shortly after Jobs’s appearance before Zarem’s group, Apple started bundling it with VisiCalc, a unique software program known as a financial spreadsheet—a “killer app” that would single-handedly turn the Apple II into a popular businessman’s tool. With fewer than forty employees in 1978, Apple was already one of the most sought-after investments among the small community of speculative private investors known as venture capitalists.
January: The Altair 8800, a hobbyist’s personal computer sold as a mail-order kit, is featured on the cover of Popular Electronics, enthralling a generation of youthful technology buffs—among them, Bill Gates—with the possibilities of personal computing. February: PARC engineers demonstrate for their colleagues a graphical user interface for a personal computer, including icons and the first use of pop-up menus, that will develop into the Windows and Macintosh interfaces of today. March 1: PARC’s permanent headquarters at 3333 Coyote Hill Road are formally opened. January 3: Apple Computer is incorporated by Steve Jobs and Steve Wozniak. August: Having perfected a new technology for designing high-density computer chips at PARC, Lynn Conway and Carver Mead begin drafting Introduction to VLSI Systems, a textbook on the technology that is written and typeset entirely on desktop publishing systems invented at the center. August 18: Xerox shelves a plan to market the Alto as a commercial project, closing the door to any possibility that the company will be in the vanguard of personal computing.
With BravoX nearing completion he was unsure of his next step, especially given the absence of any sign that Xerox meant to follow up ASD’s market probes with a full-scale merchandising program. He had only grown more restive when a friend showed him an Apple II running VisiCalc. The spreadsheet program was new to him but dazzling in its power. One typed numbers or formulas into the cells of a grid and linked them, so the answer from one cell could be part of the formula of another. This allowed anyone to tabulate data in an infinite number of permutations. It was particularly valuable for businessmen and engineers, who could perform “what-if” analyses simply by altering a figure here or there and letting the grid automatically calculate the myriad ramifications of the change. Sure enough, within months VisiCalc had transformed the Apple II into a commercial sensation. By contrast, at PARC, where funds had flowed so limitlessly that no one ever felt the urge to run “what-if” budget scenarios, the spreadsheet idea had not even occurred to the greatest software engineers in the world.
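The cell-linking mechanism the passage describes can be sketched in a few lines. This is a toy model, not VisiCalc’s actual implementation; the cell names and formulas below are invented for illustration. A cell holds either a number or a formula, a formula may read other cells, and re-evaluating after changing one figure is exactly the “what-if” analysis the excerpt mentions.

```python
# Toy spreadsheet sketch (assumed, not VisiCalc's real design):
# a sheet maps cell names to plain numbers or to formula callables.

def evaluate(sheet, cell, path=()):
    """Resolve a cell to a number, following cell references recursively."""
    if cell in path:
        raise ValueError(f"circular reference at {cell}")
    value = sheet[cell]
    if callable(value):  # a formula: call it with a getter for other cells
        return value(lambda ref: evaluate(sheet, ref, path + (cell,)))
    return value         # a plain number

sheet = {
    "A1": 120.0,
    "A2": 80.0,
    "B1": lambda get: get("A1") + get("A2"),  # B1 = A1 + A2
}

print(evaluate(sheet, "B1"))  # 200.0
sheet["A1"] = 150.0           # a "what-if": alter a figure here...
print(evaluate(sheet, "B1"))  # 230.0 -- the dependent cell reflects it
```

A real spreadsheet tracks a dependency graph and recalculates only affected cells; this sketch simply re-evaluates on demand, which is enough to show why linking cells made tabulation so powerful.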
The Attention Merchants: The Epic Scramble to Get Inside Our Heads by Tim Wu
1960s counterculture, Affordable Care Act / Obamacare, AltaVista, Andrew Keen, anti-communist, Apple II, Apple's 1984 Super Bowl advert, barriers to entry, Bob Geldof, borderless world, Brownian motion, Burning Man, Cass Sunstein, citizen journalism, colonial rule, East Village, future of journalism, George Gilder, Golden Gate Park, Googley, Gordon Gekko, housing crisis, informal economy, Internet Archive, Jaron Lanier, Jeff Bezos, jimmy wales, Live Aid, Mark Zuckerberg, Marshall McLuhan, McMansion, Nate Silver, Network effects, Nicholas Carr, placebo effect, post scarcity, race to the bottom, road to serfdom, Saturday Night Live, science of happiness, self-driving car, side project, Silicon Valley, slashdot, Snapchat, Steve Jobs, Steve Wozniak, Steven Levy, Ted Nelson, telemarketer, the built environment, The Chicago School, the scientific method, The Structural Transformation of the Public Sphere, Tim Cook: Apple, Torches of Freedom, Upton Sinclair, upwardly mobile, white flight, zero-sum game
They made easier the entry into the home of not just more consoles, but also home computers, like the Apple II or the Commodore 64, for it was one thing to buy an expensive machine that supposedly would be used for work or programming; it was another to get all that along with the spoonful of sugar, namely, a machine that also came with even better games than the Atari had. In this way video games were arguably the killer app—the application that justifies the investment—of many computers in the home. As a game machine, sometimes used for other purposes, computers had gained their foothold. There they would lie for some time, a sleeping giant.7 * * * * Breakout was written by Apple’s cofounders, Steve Wozniak and Steve Jobs, as a side project, as described in the Master Switch, chapter 20. CHAPTER 16 AOL PULLS ’EM IN In 1991, when Steve Case, just thirty-three years old, was promoted to CEO of AOL, there were four companies, all but one lost to history, that shared the goal of trying to get Americans to spend more leisure time within an abstraction known as an “online computer network.”
While still primitive in various ways, and still offering nothing like the draw of television, the computer, the third screen, had arrived. In the end, AOL was no corporate Ozymandias; though a failure, it would have a lasting and monumental legacy. True to its name, it got America online—reaching out to one another, ready for the biggest attention harvest since television. * * * *1 Before this, personal computers had come in the now unrecognizable form of hobbyist kits, assembled and programmed by guys like Steve Wozniak of Apple. For more, see The Master Switch, 274–75. *2 Example: +++, ATDT (416) 225-9492. *3 The movie also proved an opportunity for the first meetings between AOL and Time Warner executives: Steve Case and Jerry Levin met at a White House screening of the film. See The Master Switch, chapter 19. *4 The “floppy” disk was a magnetic storage medium used in the 1980s and early 1990s, originally the size of a dinner napkin, that was inserted into a “disk drive” that resembled a toaster.
But most who had a computer usually kept it in the den or basement; the machine itself was unwieldy and ugly, consisting of a large, boxy body and a screen smaller than today’s laptops. In an age before widespread use of graphical interfaces like Windows, a glowing text, orange or green, was still what one faced; it had been that way since the first fully assembled home computers with dedicated screens, the Apple II and the Commodore PET, were marketed in 1977.*1 As for a mouse, that was still a creature known to reside in a small hole. Meanwhile, just as today, the television remained prominent in the living room, with its large attractive screen and dozens of channels; to use it required little expertise, nothing of the sort of arcane knowledge needed to operate a modem. Thus even to speak of the computer as a competitor to television in 1991 would have been a laughable proposition.
Overcomplicated: Technology at the Limits of Comprehension by Samuel Arbesman
algorithmic trading, Anton Chekhov, Apple II, Benoit Mandelbrot, citation needed, combinatorial explosion, Danny Hillis, David Brooks, digital map, discovery of the americas, en.wikipedia.org, Erik Brynjolfsson, Flash crash, friendly AI, game design, Google X / Alphabet X, Googley, HyperCard, Inbox Zero, Isaac Newton, iterative process, Kevin Kelly, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, mandelbrot fractal, Minecraft, Netflix Prize, Nicholas Carr, Parkinson's law, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman: Challenger O-ring, Second Machine Age, self-driving car, software studies, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, superintelligent machines, Therac-25, Tyler Cowen: Great Stagnation, urban planning, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Y2K
A self-taught genius who worked during the early part of the twentieth century, Ramanujan was not your average mathematician who tried to solve problems through trial and error and occasional flashes of brilliance. Instead, equations seemed to leap fully formed from his brain, often mind-bogglingly complex and stunningly correct (though some were also wrong). The Ramanujan of technology might be Steve Wozniak. Wozniak programmed the first Apple computer and was responsible for every aspect of the Apple II. As the programmer and novelist Vikram Chandra notes, “Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found. . . . Woz did both hardware and software. Woz created a programming language in machine code. Woz is hardcore.” Wozniak was on a level of technological understanding that few can reach. We can even see the extremes of our brain’s capacity—as well as how its limits can be stretched—in the way London cabdrivers acquire and use what is known as The Knowledge.
abstraction, 163 biological thinking’s avoidance of, 115–16 in complexity science, 133, 135 in physics thinking, 115–16, 121–22, 128 specialization and, 24, 26–27 technological complexity and, 23–28, 81, 121–22 accretion, 65 in complex systems, 36–43, 51, 62, 65, 191 in genomes, 156 in infrastructure, 42, 100–101 legacy systems and, 39–42 in legal system, 40–41, 46 in software, 37–38, 41–42, 44 in technological complexity, 130–31 unexpected behavior and, 38 aesthetics: biological thinking and, 119 and physics thinking, 113, 114 aggregation, diffusion-limited, 134–35 algorithm aversion, 5 Amazon, 5 American Philosophical Society, 90 Anaximander of Miletus, 139 Apple, 161, 163 Apple II computer, 77 applied mathematics, 143 arche, 140 Ariane 5 rocket, 1996 explosion of, 11–12 Aristotle, 151 Ascher, Kate, 100 Asimov, Isaac, 124 atomic nucleus, discovery of, 124, 141 Audubon, John James, 109 autocorrect, 5, 16 automobiles: self-driving, 91, 231–32 software in, 10–11, 13, 45, 65, 100, 174 see also Toyota automobiles Autonomous Technology (Winner), 22 Average Is Over (Cowen), 84 awe, as response to technological complexity, 6, 7, 154–55, 156, 165, 174 bacteria, 124–25 Balkin, Jack, 60–61 Ball, Philip, 12, 87–88, 136, 140 Barr, Michael, 10 Barrow, Isaac, 89 BASIC, 44–45 Bayonne Bridge, 46 Beacock, Ian, 12–13 Benner, Steven, 119 “Big Ball of Mud” (Foote and Yoder), 201 binary searches, 104–5 biological systems, 7 accretion in, 130–31 complexity of, 116–20, 122 digital technology and, 49 kluges in, 119 legacy code in, 118, 119–20 modules in, 63 tinkering in, 118 unexpected behavior in, 109–10, 123–24 biological thinking, 222 abstraction avoided in, 115–16 aesthetics and, 119 as comfortable with diversity and complexity, 113–14, 115 concept of miscellaneous in, 108–9, 140–41, 143 as detail oriented, 121, 122, 128 generalization in, 131–32 humility and, 155 physics thinking vs., 114–16, 137–38, 142–43, 222 technological complexity and, 116–49, 158, 174 Blum, Andrew, 101–2 
Boeing 777, 99 Bogost, Ian, 154 Bookout, Jean, 10 Boorstin, Daniel, 89 Borges, Jorge Luis, 76–77, 131 Boston, Mass., 101, 102 branch points, 80–81 Brand, Stewart, 39–40, 126, 198–99 Brookline, Mass., 101 Brooks, David, 155 Brooks, Frederick P., Jr., 38, 59, 93 bugs, in software, see software bugs bureaucracies, growth of, 41 cabinets of curiosities (wunderkammers), 87–88, 140 calendar application, programming of, 51–53 Cambridge, Mass., 101 cancer, 126 Carew, Diana, 46 catastrophes, interactions in, 126 Challenger disaster, 9, 11, 12, 192 Chandra, Vikram, 77 Chaos Monkey, 107, 126 Chekhov, Anton, 129 Chekhov’s Gun, 129 chess, 84 Chiang, Ted, 230 clickstream, 141–42 Clock of the Long Now, The (Brand), 39–40 clouds, 147 Code of Federal Regulations, 41 cognitive processing: of language, 73–74 limitations on, 75–76, 210 nonlinear systems and, 78–79 outliers in, 76–77 working memory and, 74 see also comprehension, human collaboration, specialization and, 91–92 Commodore VIC-20 computer, 160–61 complexity, complex systems: acceptance of, see biological thinking accretion in, 36–43, 51, 62, 65, 191 aesthetics of, 148–49, 156–57 biological systems and, 116–17, 122 buoys as examples of, 14–15, 17 complication vs., 13–15 connectivity in, 14–15 debugging of, 103–4 edge cases in, 53–62, 65, 201, 205 feedback and, 79, 141–45 Gall on, 157–58, 227 hierarchies in, 27, 50–51 human interaction with, 163 infrastructure and, 100–101 inherent vs. 
accidental, 189 interaction in, 36, 43–51, 62, 65, 146 interconnectivity of, see interconnectivity interpreters of, 166–67, 229 kluges as inevitable in, 34–36, 62–66, 127 in legal systems, 85 and limits of human comprehension, 1–7, 13, 16–17, 66, 92–93 “losing the bubble” and, 70–71, 85 meaning of terms, 13–20 in natural world, 107–10 scientific models as means of understanding, 165–67 specialization and, 85–93 unexpected behavior in, 27, 93, 96–97, 98–99, 192 see also diversity; technological complexity complexity science, 132–38, 160 complication, complexity vs., 13–15 comprehension, human: educability of, 17–18 mystery and, 173–74 overoptimistic view of, 12–13, 152–53, 156 wonder and, 172 see also cognitive processing comprehension, human, limits of, 67, 212 complex systems and, 1–7, 13, 16–17, 66, 92–93 humility as response to, 155–56 interconnectivity and, 78–79 kluges and, 42 legal system and, 22 limitative theorems and, 175 “losing the bubble” in, 70–71, 85 Maimonides on, 152 stock market systems and, 26–27 technological complexity and, 18–29, 69–70, 80–81, 153–54, 169–70, 175–76 unexpected behavior and, 18–22, 96–97, 98 “Computational Biology” (Doyle), 222 computational linguistics, 54–57 computers, computing: complexity of, 3 evolutionary, 82–84, 213 impact on technology of, 3 see also programmers, programming; software concealed electronic complexity, 164 Congress, U.S., 34 Constitution, U.S., 33–34 construction, cost of, 48–50 Cope, David, 168–69, 229–30 corpus, in linguistics, 55–56 counting: cognitive limits on, 75 human vs. 
computer, 69–70, 97, 209 Cowen, Tyler, 84 Cryptonomicon (Stephenson), 128–29 “Crystalline Structure of Legal Thought, The” (Balkin), 60–61 Curiosity (Ball), 87–88 Dabbler badge, 144–45 dark code, 21–22 Darwin, Charles, 115, 221, 227 Daston, Lorraine, 140–41 data scientists, 143 datasets, massive, 81–82, 104–5, 143 debugging, 103–4 Deep Blue, 84 diffusion-limited aggregation (DLA), 134–35 digital mapping systems, 5, 49, 51 Dijkstra, Edsger, 3, 50–51, 155 “Divers Instances of Peculiarities of Nature, Both in Men and Brutes” (Fairfax), 111–12 diversity, 113–14, 115 see also complexity, complex systems DNA, see genomes Doyle, John, 222 Dreyfus, Hubert, 173 dwarfism, 120 Dyson, Freeman, on unity vs. diversity, 114 Dyson, George, 110 Economist, 41 edge cases, 53–62, 65, 116, 128, 141, 201, 205, 207 unexpected behavior and, 99–100 see also outliers Einstein, Albert, 114 Eisen, Michael, 61 email, evolution of, 32–33 emergence, in complex systems, 27 encryption software, bugs in, 97–98 Enlightenment, 23 Entanglement, Age of, 23–29, 71, 92, 96, 97, 165, 173, 175, 176 symptoms of, 100–102 Environmental Protection Agency, 41 evolution: aesthetics and, 119 of biological systems, 117–20, 122 of genomes, 118, 156 of technological complexity, 127, 137–38 evolutionary computation, 82–84, 213 exceptions, see edge cases; outliers Facebook, 98, 189 failure, cost of, 48–50 Fairfax, Nathanael, 111–12, 113, 140 fear, as response to technological complexity, 5, 7, 154–55, 156, 165 Federal Aviation Administration (FAA), Y2K bug and, 37 feedback, 14–15, 79, 135 Felsenstein, Lee, 21 Fermi, Enrico, 109 Feynman, Richard, 9, 11 field biologists, 122 for complex technologies, 123, 126, 127, 132 financial sector: interaction in, 126 interconnectivity of, 62, 64 see also stock market systems Firthian linguistics, 206 Flash Crash (2010), 25 Fleming, Alexander, 124 Flood, Mark, 61, 85 Foote, Brian, 201 Fortran, 39 fractals, 60, 61, 136 Frederick the Great, king of Prussia, 89 fruit flies, 109–10 
“Funes the Memorious” (Borges), 76–77, 131 Galaga, bug in, 95–96, 97, 216–17 Gall, John, 157–58, 167, 227 game theory, 210 garden path sentences, 74–75 generalists, 93 combination of physics and biological thinking in, 142–43, 146 education of, 144, 145 explosion of knowledge and, 142–49 specialists and, 146 as T-shaped individuals, 143–44, 146 see also Renaissance man generalization, in biological thinking, 131–32 genomes, 109, 128 accretion in, 156 evolution of, 118, 156 legacy code (junk) in, 118, 119–20, 222 mutations in, 120 RNAi and, 123–24 Gibson, William, 176 Gingold, Chaim, 162–63 Girl Scouts, 144–45 glitches, see unexpected behavior Gmail, crash of, 103 Gödel, Kurt, 175 “good enough,” 27, 42, 118, 119 Goodenough, Oliver, 61, 85 Google, 32, 59, 98, 104–5 data centers of, 81–82, 103, 189 Google Docs, 32 Google Maps, 205 Google Translate, 57 GOTO command, 44–45, 81 grammar, 54, 57–58 gravitation, Newton’s law of, 113 greeblies, 130–31 Greek philosophy, 138–40, 151 Gresham College, 89 Guide of the Perplexed, The (Maimonides), 151 Haldane, J.
The Rise of the Network Society by Manuel Castells
"Robert Solow", Apple II, Asian financial crisis, barriers to entry, Big bang: deregulation of the City of London, Bob Noyce, borderless world, British Empire, business cycle, capital controls, complexity theory, computer age, computerized trading, creative destruction, Credit Default Swap, declining real wages, deindustrialization, delayed gratification, dematerialisation, deskilling, disintermediation, double helix, Douglas Engelbart, Douglas Engelbart, edge city, experimental subject, financial deregulation, financial independence, floating exchange rates, future of work, global village, Gunnar Myrdal, Hacker Ethic, hiring and firing, Howard Rheingold, illegal immigration, income inequality, Induced demand, industrial robot, informal economy, information retrieval, intermodal, invention of the steam engine, invention of the telephone, inventory management, James Watt: steam engine, job automation, job-hopping, John Markoff, knowledge economy, knowledge worker, labor-force participation, laissez-faire capitalism, Leonard Kleinrock, longitudinal study, low skilled workers, manufacturing employment, Marc Andreessen, Marshall McLuhan, means of production, megacity, Menlo Park, moral panic, new economy, New Urbanism, offshore financial centre, oil shock, open economy, packet switching, Pearl River Delta, peer-to-peer, planetary scale, popular capitalism, popular electronics, post-industrial society, postindustrial economy, prediction markets, Productivity paradox, profit maximization, purchasing power parity, RAND corporation, Robert Gordon, Robert Metcalfe, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, social software, South China Sea, South of Market, San Francisco, special economic zone, spinning jenny, statistical model, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, the medium is the message, the new new thing, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, total factor productivity, trade liberalization, transaction costs, urban renewal, urban sprawl, zero-sum game
In 1975, Ed Roberts, an engineer who had created a small calculator company, MITS, in Albuquerque, New Mexico, built a computing box with the improbable name of Altair, after a character in the Star Trek TV series, that was the object of admiration of the inventor’s young daughter. The machine was a primitive object, but it was built as a small-scale computer around a microprocessor. It was the basis for the design of Apple I, then of Apple II, the first commercially successful micro-computer, realized in the garage of their parents’ home by two young school drop-outs, Steve Wozniak and Steve Jobs, in Menlo Park, Silicon Valley, in a truly extraordinary saga that has by now become the founding legend of the Information Age. Launched in 1976, with three partners and $91,000 capital, Apple Computers had by 1982 reached $583 million in sales, ushering in the age of diffusion of computer power. IBM reacted quickly: in 1981 it introduced its own version of the microcomputer, with a brilliant name: the Personal Computer (PC), which became in fact the generic name for microcomputers.
But a few indications seem to point to the fact that they were intentionally trying to undo the centralizing technologies of the corporate world, both out of conviction and as their market niche. As evidence, I recall the famous Apple Computer 1984 advertising spot to launch Macintosh, in explicit opposition to Big Brother IBM of Orwellian mythology. As for the countercultural character of many of these innovators, I shall also refer to the life story of the genius developer of the personal computer, Steve Wozniak: after quitting Apple, bored by its transformation into another multinational corporation, he spent a fortune for a few years subsidizing rock groups that he liked, before creating another company to develop technologies of his taste. At one point, after having created the personal computer, Wozniak realized that he had no formal education in computer sciences, so he enrolled at UC Berkeley.
In 1988, it could be estimated that “venture capital accounted for about one-half of the new product and service investment associated with the information and communication industry.”68 A similar process took place in the development of the microcomputer, which introduced a historical divide in the uses of information technology.69 By the mid-1970s Silicon Valley had attracted tens of thousands of bright young minds from around the world, drawn to the excitement of the new technological Mecca in a quest for the talisman of invention and money. They gathered in loose groups to exchange ideas and information on the latest developments. One such gathering was the Homebrew Computer Club, whose young visionaries (including Bill Gates, Steve Jobs, and Steve Wozniak) would go on to create in the following years up to 22 companies, including Microsoft, Apple, Cromemco, and North Star. It was the club’s reading of a Popular Electronics article reporting Ed Roberts’s Altair machine that inspired Wozniak to design a microcomputer, the Apple I, in the summer of 1976. Steve Jobs saw the potential, and together they founded Apple, with a $91,000 investment from a former Intel executive, Mike Markkula, who came in as a partner.
Age of Context: Mobile, Sensors, Data and the Future of Privacy by Robert Scoble, Shel Israel
Albert Einstein, Apple II, augmented reality, call centre, Chelsea Manning, cloud computing, connected car, Edward Snowden, Edward Thorp, Elon Musk, factory automation, Filter Bubble, G4S, Google Earth, Google Glasses, Internet of things, job automation, John Markoff, Kickstarter, lifelogging, Marc Andreessen, Mars Rover, Menlo Park, Metcalfe’s law, New Urbanism, PageRank, pattern recognition, RFID, ride hailing / ride sharing, Robert Metcalfe, Saturday Night Live, self-driving car, sensor fusion, Silicon Valley, Skype, smart grid, social graph, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Tesla Model S, Tim Cook: Apple, ubercab, urban planning, Zipcar
Some years will pass before people look back and try to understand how they ever could have lived without such a device. Scoble tells audiences it’s like seeing the first Apple IIs as they rolled off the assembly line in 1977: They were like nothing people had seen before, but you couldn’t do much with them. Decision makers at HP and Atari weren’t interested in cutting a deal with Steve Wozniak and Steve Jobs for rights to market their new computer—the new, highly personalized devices were obviously too radically different to sell in significant quantity. Yet, it turned out a lot of people wanted them and the Apple II kicked off a 20-year explosion of invention and productivity that we now remember as the PC revolution. Google Glass will do the same. How long will it take? We’re not sure.
The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine by Peter Lunenfeld
Albert Einstein, Andrew Keen, anti-globalists, Apple II, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, Burning Man, business cycle, butterfly effect, computer age, creative destruction, crowdsourcing, cuban missile crisis, Dissolution of the Soviet Union, don't be evil, Douglas Engelbart, Douglas Engelbart, Dynabook, East Village, Edward Lorenz: Chaos theory, Fall of the Berlin Wall, Francis Fukuyama: the end of history, Frank Gehry, Grace Hopper, gravity well, Guggenheim Bilbao, Honoré de Balzac, Howard Rheingold, invention of movable type, Isaac Newton, Jacquard loom, Jane Jacobs, Jeff Bezos, John Markoff, John von Neumann, Kickstarter, Mark Zuckerberg, Marshall McLuhan, Mercator projection, Metcalfe’s law, Mother of all demos, mutually assured destruction, Nelson Mandela, Network effects, new economy, Norbert Wiener, PageRank, pattern recognition, peer-to-peer, planetary scale, plutocrats, Plutocrats, post-materialism, Potemkin village, RFID, Richard Feynman, Richard Stallman, Robert Metcalfe, Robert X Cringely, Schrödinger's Cat, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, Skype, social software, spaced repetition, Steve Ballmer, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, The Death and Life of Great American Cities, the medium is the message, Thomas L Friedman, Turing machine, Turing test, urban planning, urban renewal, Vannevar Bush, walkable city, Watson beat the top human players on Jeopardy!, William Shockley: the traitorous eight
Jobs and Gates started out when personal computing, that idea advanced by Licklider the Patriarch and Kay the Aquarian, was the province of a tiny group of obsessed hobbyists. It was a business, but one with a smaller market than fly-fishing. As teenagers in the 1970s, Jobs and Gates were part of this small group of hobbyists who purchased kits to make simple, programmable computers to use (and play with) at home. Jobs, along with Steve Wozniak, was a member of the best-known group of these enthusiasts, the Homebrew Computer Club of Cupertino, California. Gates, who had been programming since he found himself able to get access to a DEC mainframe in high school, was already writing software professionally while he was a student at Harvard. Jobs and Gates, along with their collaborators and competitors in the mid-1970s, were positioned at a fulcrum point, when a diversion turned into a business.
What made them both rich and powerful was their ability to meld the attributes of the two generations that preceded them—fusing the hardheaded business logic of the Plutocrats with the visionary futurity of the Aquarians. Jobs and Gates have an interesting competitive history, leapfrogging each other in the quest to achieve “insane greatness,” in Jobs’s words, and global market preeminence, for Gates.21 Jobs and his partner, Wozniak, were the first to make the leap from hobbyists to industrialists with their Apple computers, launched in 1976. It was the Apple II that really broke loose, in 1977, attracting a huge user base and establishing Jobs and Wozniak as the first publicly lauded millionaire whiz kids of Silicon Valley. As important as their early success with the Apple II was, however, their greatest impact came seven years later, when they took the inspiration of people like Engelbart and Kay and created a mass-market personal computer that set a new standard for participation. Before we get to that, we need to return to 1976, and move from Silicon Valley to New Mexico, where Gates and his partners, including Paul Allen and his former Harvard classmate Steve Ballmer, were writing programs for the Altair computer.
He worked for early electronic games pioneer Atari in the mid-1970s and visited Xerox PARC, where he saw the work infused with Engelbart and Kay’s Aquarian vision. This spirit resonated with Jobs, who at one point had taken a personal pilgrimage to India and lived in an ashram. But even more so, the meme of participation entered his head on those visits to PARC. The Apple II, released in 1977, was unique in having a graphics capability and a soundboard built in. Here was the first major computer for the masses, designed from the start as a multimedia machine. These Apple IIs became the de facto machines in classrooms around the country, and without a doubt prepared a generation of computer users for what was to come. Jobs understood that the graphical user interface would open up a whole new range of applications to nonexpert users, and, even more, would expand that user community exponentially.
Geek Sublime: The Beauty of Code, the Code of Beauty by Vikram Chandra
Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, British Empire, business process, conceptual framework, create, read, update, delete, crowdsourcing, don't repeat yourself, Donald Knuth, East Village, European colonialism, finite state, Firefox, Flash crash, glass ceiling, Grace Hopper, haute couture, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Norman Mailer, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, supercomputer in your pocket, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce
After such knowledge, reverence is the only proper emotion; the narrator tells his Big Boss that he can’t fix the error because he can’t find it. I didn’t feel comfortable hacking up the code of a Real Programmer.12 Despite the allusion above to “the *macho* side of programming,” the non-geek may not fully grasp that within the culture of programmers, Mel es muy macho. The Real Programmer squints his eyes, does his work, and rides into the horizon to the whistling notes of Ennio Morricone. To you, Steve Wozniak may be that cuddly penguin who was on a few episodes of Dancing with the Stars, and by all accounts, he really is the good, generous man one sees in interviews. But within the imaginations of programmers, Woz is also a hard man, an Original Gangsta: he wired together his television set and a keyboard and a bunch of chips on a circuit board and so created the Apple I computer. Then he realized he needed a programming language for the microprocessor he’d used, and none existed, so Woz—who had never taken a language-design class—read a couple of books, wrote a compiler, and then wrote a programming language called Integer BASIC in machine code.
And when we say “wrote” this programming language we mean that he wrote the assembly code in a paper notebook on the right side of the pages, and then transcribed it into machine code on the left.13 And he did all this while holding down a full-time job at Hewlett-Packard: “I designed two computers and cassette tape interfaces and printer interfaces and serial ports and I wrote a Basic and all this application software, I wrote demos, and I did all this moonlighting, all in a year.”14 That second computer was the Apple II, the machine that defined personal computing, that is on every list of the greatest computers ever made. Woz designed all the hardware and all the circuit boards and all the software that went into the Apple II, while the other Steve spewed marketing talk at potential investors and customers on the phone. Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found, “not one bug in the hardware, not one bug in the software.”15 The circuit design of the Apple II is widely considered to be astonishingly beautiful, as close to perfection as one can get in engineering. Woz did both hardware and software. Woz created a programming language in machine code.
University of Pennsylvania Almanac 42, no. 18 (1996): 4–7.
Witzel, Michael. “On the Origin of the Literary Device of the Frame Story in Old Indian Literature.” In Hinduismus und Buddhismus: Festschrift für Ulrich Schneider, edited by Harry Falk, 380–414. Freiburg: Hedwig Falk, 1987.
World Economic Forum. Global Gender Gap Report. October 23, 2012. http://www.weforum.org/issues/global-gender-gap.
Wozniak, Steve. “And Then There Was Apple.” Apple II History. Accessed August 10, 2013. http://apple2history.org/museum/articles/ca8610/.
Wright, Edmund, and John Daintith. A Dictionary of Computing. Online. Oxford University Press, 2008. http://www.oxfordreference.com/10.1093/acref/9780199234004.001.0001/acref-9780199234004-e-2050.
Wujastyk, Dominik. “Indian Manuscripts.” In Manuscript Cultures: Mapping the Field, edited by Jörg Quenzer and Jan-Ulrich Sobisch.
The One Device: The Secret History of the iPhone by Brian Merchant
Airbnb, animal electricity, Apple II, Apple's 1984 Super Bowl advert, citizen journalism, Claude Shannon: information theory, computer vision, conceptual framework, Douglas Engelbart, Dynabook, Edward Snowden, Elon Musk, Ford paid five dollars a day, Frank Gehry, global supply chain, Google Earth, Google Hangouts, Internet of things, Jacquard loom, John Gruber, John Markoff, Jony Ive, Lyft, M-Pesa, MITM: man-in-the-middle, more computing power than Apollo, Mother of all demos, natural language processing, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shock, pattern recognition, peak oil, pirate software, profit motive, QWERTY keyboard, ride hailing / ride sharing, rolodex, Silicon Valley, Silicon Valley startup, skunkworks, Skype, Snapchat, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Tim Cook: Apple, Turing test, uber lyft, Upton Sinclair, Vannevar Bush, zero day
His head is clean-shaven and he sports a thick, graying mustache and a quick, mischievous smile. He grew up in Florida with a love of tinkering and gadgets; he was more Wozniak than Jobs, and experimented with hardware in his spare time. “I was a hacker, and hackers, well, from that era—hackers meant you could build a computer from scratch. So, I was building computers,” he says, some based on the motherboard designs of Steve Wozniak. “It was unfortunate, I’ll call it, to live in Florida, outside of where the [Silicon] Valley stuff was going on.” He graduated with a degree in electrical engineering from the Florida Institute of Technology and went to work for IBM. He stayed at the company for sixteen years, rising through the ranks thanks to his mastery of both hardware and software. In the 1980s, he joined an “advanced research team” that was in charge of engineering IBM’s first laptop computer and making it as small as possible.
He discovered that he could whistle at a certain frequency into his home phone and gain access to the long-distance operator, for free. John Draper, another legendary hacker who came to be known as Captain Crunch, found that the pitch of a toy whistle that came free in Cap’n Crunch cereal boxes could be used to open long-distance call lines; he built blue boxes, electronic devices that generated the tone, and demonstrated the technology to a young Steve Wozniak and his friend Steve Jobs. Jobs famously turned the blue boxes into his first ad hoc entrepreneurial effort; Woz built them, and Jobs sold them. The culture of hacking, reshaping, and bending consumer technologies to one’s personal will is as old as the history of those technologies. The iPhone is not immune. In fact, hackers helped push the phone toward adopting its most successful feature, the App Store.
Fix What You Hate From Steve Jobs to Jony Ive to Tony Fadell to Apple’s engineers, designers, and managers, there’s one part of the iPhone mythology that everyone tends to agree on: Before the iPhone, everyone at Apple thought cell phones “sucked.” They were “terrible.” Just “pieces of junk.” We’ve already seen how Jobs felt about phones that dropped calls. “Apple is best when it’s fixing the things that people hate,” Greg Christie tells me. Before the iPod, nobody could figure out how to use a digital music player; as Napster boomed, people took to carting around skip-happy portable CD players loaded with burned albums. And before the Apple II, computers were mostly considered too complex and unwieldy for the layperson. “For at least a year before starting on what would become the iPhone project, even internally at Apple, we were grumbling about how all of these phones out there were all terrible,” says Nitin Ganatra, who managed Apple’s email team before working on the iPhone. It was water-cooler talk. But it reflected a growing sense inside the company that since Apple had successfully fixed—transformed, then dominated—one major product category, it could do the same with another.
Makers by Chris Anderson
3D printing, Airbnb, Any sufficiently advanced technology is indistinguishable from magic, Apple II, autonomous vehicles, barriers to entry, Buckminster Fuller, Build a better mousetrap, business process, commoditize, Computer Numeric Control, crowdsourcing, dark matter, David Ricardo: comparative advantage, death of newspapers, dematerialisation, Elon Musk, factory automation, Firefox, future of work, global supply chain, global village, IKEA effect, industrial robot, interchangeable parts, Internet of things, inventory management, James Hargreaves, James Watt: steam engine, Jeff Bezos, job automation, Joseph Schumpeter, Kickstarter, Lean Startup, manufacturing employment, Mark Zuckerberg, means of production, Menlo Park, Network effects, private space industry, profit maximization, QR code, race to the bottom, Richard Feynman, Ronald Coase, Rubik’s Cube, self-driving car, side project, Silicon Valley, Silicon Valley startup, Skype, slashdot, South of Market, San Francisco, spinning jenny, Startup school, stem cell, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supply-chain management, The Nature of the Firm, The Wealth of Nations by Adam Smith, transaction costs, trickle-down economics, Whole Earth Catalog, X Prize, Y Combinator
Writing in Wired,12 Steven Levy explained the connection, which led to the original Apple II in 1977: His dad, Paul—a machinist who had never completed high school—had set aside a section of his workbench for Steve, and taught him how to build things, disassemble them, and put them together. From neighbors who worked in the electronics firms of the Valley, he learned about that field—and also understood that things like television sets were not magical things that just showed up in one’s house, but designed objects that human beings had painstakingly created. “It gave a tremendous sense of self-confidence, that through exploration and learning one could understand seemingly very complex things in one’s environment,” he told [an] interviewer. Later, when Jobs and his Apple cofounder, Steve Wozniak, were members of the Homebrew Computer Club, they saw the potential of desktop tools—in this case the personal computer—to change not just people’s lives, but also the world.
Like IBM a generation ago, which went from corporate mainframes to personal computers, they are recognizing that their futures lie with regular folks. They are pivoting from professionals to everyone. In short, the Maker Movement has arrived. This nascent movement is less than seven years old, but it’s already accelerating as fast as the early days of the PC, when the garage tinkerers who were part of the Homebrew Computer Club in 1975 created the Apple II, the first consumer desktop computer, which led to desktop computing and the explosion of a new industry. Similarly, you can mark the beginnings of the Maker Movement with such signs as the 2005 launch of Make magazine, from O’Reilly, a legendary publisher of geek bibles, and the first Maker Faire gatherings in Silicon Valley. Another key milestone arrived with RepRap, the first open-source desktop 3-D printer, which was launched in 2007.
Indeed, in 1969 Honeywell even offered a $10,000 “kitchen computer” (official name: the “H316 Pedestal Model”), which was promoted on the cover of the Neiman-Marcus catalog to do just that—it was stylishly designed, with a built-in cutting board. (There is no evidence that any actually sold, not least because the very modern cook would have to enter data with toggle switches and read the recipes displayed in binary blinking lights.) Yet when the truly personal—“desktop”—computer did eventually arrive with the Apple II and then the IBM PC, countless uses quickly emerged, starting with the spreadsheet and word processor for business and quickly moving to entertainment with video games and communications. This was not because the wise minds of the big computer companies had finally figured out why people would want one, but because people found new uses all by themselves. Then, in 1985, Apple released the LaserWriter, the first real desktop laser printer, which, along with the Mac, started the desktop publishing phenomenon.
Cult of the Dead Cow: How the Original Hacking Supergroup Might Just Save the World by Joseph Menn
4chan, A Declaration of the Independence of Cyberspace, Apple II, autonomous vehicles, Berlin Wall, Bernie Sanders, bitcoin, Chelsea Manning, commoditize, corporate governance, Donald Trump, dumpster diving, Edward Snowden, Firefox, Google Chrome, Haight Ashbury, Internet of things, Jacob Appelbaum, Jason Scott: textfiles.com, John Markoff, Julian Assange, Mark Zuckerberg, Mitch Kapor, Naomi Klein, Peter Thiel, pirate software, pre–internet, Ralph Nader, ransomware, Richard Stallman, Robert Mercer, self-driving car, side project, Silicon Valley, Skype, slashdot, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Stuxnet, Whole Earth Catalog, WikiLeaks, zero day
Rosenbaum had spent serious time with the phone phreakers, the forerunners of today’s hackers, and he explained what they were doing in plain English. The phreakers were a diverse group, including John Draper, who called himself Cap’n Crunch after learning that whistles given out with that breakfast cereal could be used to blow 2600 hertz, which allowed free calls. The technical puzzles of phreaking would attract future innovators up to and including Apple founders Steve Jobs and Steve Wozniak, who sold blue boxes to make free calls while in college. The political divide in America at the end of the 1960s was the worst until the 2000s, and that helped push phreaking in a radical direction. The phone companies were very clearly part of the establishment, and AT&T was a monopoly to boot. That made it a perfect target for the antiwar left and anyone who thought stealing from some companies was more ethical than stealing from others.
Broderick’s character accidentally tapped into a military supercomputer. The budding hackers of Lubbock weren’t looking for trouble either. A couple of the older kids had set up electronic forums known as bulletin boards, where strangers, using modems to call in over regular phone lines, could read or leave messages and text files, which the locals also called t-files. Widespread use of web browsers was still a dozen years away. Kevin had put in two years on his Apple II by the time he moved to Lubbock, so he found the local bulletin boards in short order. There weren’t a lot in his 806 area code, and most were run by hobbyists talking about computers. Some older teenagers had one that was more freewheeling, and Kevin and a group of friends chatted there for a while, until the bigger kids got tired of the hangers-on and banned them. Kevin was indignant. “We have to make our own and truly be elite,” he told friends.
His parents started him off carrying a cigar box under his chin at two and a half, Mudge said, to get him used to putting a violin there. By the time he got to Berklee he was practicing five hours a day, a routine he compared to the grueling training of Chinese acrobats. But he was never just about music. David Zatko worked on the government’s space shuttle efforts and brought home computer parts to his toddler. With a $5,000 bequest from Mudge’s grandfather, the middle-class family bought an Apple II Plus, intending it to be educational. That it was, especially because a nearby store offered software that the buyer could return quickly for a partial refund. That made cracking the copy protection an imperative for Mudge and his father, and it was an early lesson in perverse incentives, a subject that Mudge would one day find himself debating in the Pentagon. Breaking the rights management on Apple software and games like Ultima IV “was our jigsaw puzzle,” Mudge said.
A Curious Mind: The Secret to a Bigger Life by Brian Grazer, Charles Fishman
4chan, Airbnb, Albert Einstein, Apple II, Asperger Syndrome, Bonfire of the Vanities, en.wikipedia.org, game design, Google Chrome, Howard Zinn, Isaac Newton, Jeff Bezos, Kickstarter, Norman Mailer, orbital mechanics / astrodynamics, out of africa, RAND corporation, Ronald Reagan, Silicon Valley, stem cell, Steve Jobs, Steve Wozniak, the scientific method, Tim Cook: Apple
Williams: former police chief of Los Angeles
Marianne Williamson: spiritual teacher, New Age guru
Ian Wilmut: embryologist, led the team of researchers who first successfully cloned a mammal (a sheep named Dolly)
E. O. Wilson: biologist, author, professor emeritus at Harvard University, two-time winner of the Pulitzer Prize
Oprah Winfrey: founder and chairwoman of the Oprah Winfrey Network, actress, author
George C. Wolfe: playwright, theater director, two-time winner of the Tony Award
Steve Wozniak: cofounder of Apple Inc., designer of the Apple I and Apple II computers, inventor
John D. Wren: president and CEO of marketing and communications company Omnicom
Will Wright: game designer, creator of SimCity and The Sims
Steve Wynn: businessman, Las Vegas casino magnate
Gideon Yago: writer, former correspondent for MTV News
Eitan Yardeni: teacher and spiritual counselor at the Kabbalah Centre
Daniel Yergin: economist, author of The Prize: The Epic Quest for Oil, Money and Power, winner of the Pulitzer Prize
Dan York: chief content officer at DirecTV, former president of content and advertising sales, AT&T
Michael W.
The Perfect Thing: How the iPod Shuffles Commerce, Culture, and Coolness by Steven Levy
Apple II, British Empire, Claude Shannon: information theory, en.wikipedia.org, indoor plumbing, Internet Archive, Jeff Bezos, John Markoff, Joi Ito, Jony Ive, Kevin Kelly, Sand Hill Road, Saturday Night Live, Silicon Valley, social web, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, technology bubble, Thomas L Friedman
Two things always seem to evoke an indignant outburst of "It ain't natural!" One is drugs; the other is technology, applied so as to please ourselves. When the latter is used to get effects as mind-blowing as the former, things become really interesting. (One of the most memorable quotes I've ever gathered in my reporting career came in 1982, covering the US Festival, a huge rock concert sponsored by Apple cofounder Steve Wozniak. At a motel nearby, Jerry Garcia, who was prepping to play a "Breakfast with the Grateful Dead" set, proclaimed, "Technology is the new drugs." Okay, not an original concept, but consider the source.) Without altering one's chemical composition, the iPod does change your head. Plugging directly into your ears, dominating the brain matter in between, and shuffling your music collection to extract constant delight, it generates a portable alternative reality, almost always more pleasant than the real one.
The Sony-ites in attendance, mostly younger engineers, technical designers, and marketers, responded with panic but executed with fervor, in part because the Walkman was a product they wanted to own themselves. They made the deadline. On June 22, 1979, Sony unveiled its baby with a degree of showmanship and suspense that Steve Jobs—then a twenty-four-year-old mogul-in-training, beginning to contemplate a successor to the Apple II—might have appreciated. Journalists arriving at Sony's headquarters in the Ginza were directed to buses and handed a Walkman. Not until they arrived at Yoyogi Park—Tokyo's version of Central Park, which was filled every Sunday with crazed Elvis impersonators—were they directed to turn the devices on, to hear a sonically thrilling stereo introduction to the capabilities and virtues of the Walkman.
It's not as though Jobs doesn't get respect; in the business mags he is the equivalent of Princess Diana as a cover subject. (Jobs considers cover stories his birthright and often grants exclusive access in exchange for getting Apple's new products on the cover.) But only recently, with the dual success of the iPod and Pixar, have people come to realize that Jobs is building a historical legacy. This is a guy who has pulled off four accomplishments that rocked the world. With the Apple II, he was instrumental in introducing the concept of a personal computer to the world. With the Macintosh, he popularized what was to become the dominant—and friendliest—means of using a computer. As the CEO of Pixar, he helped usher in the era of computer-animated feature films. And now there is the iPod. Jobs himself looks back to the Macintosh effort as a peak. Other people involved in the effort look back to that period with a Camelot-type nostalgia.
Running Money by Andy Kessler
Andy Kessler, Apple II, bioinformatics, Bob Noyce, British Empire, business intelligence, buy and hold, buy low sell high, call centre, Corn Laws, Douglas Engelbart, family office, full employment, George Gilder, happiness index / gross national happiness, interest rate swap, invisible hand, James Hargreaves, James Watt: steam engine, joint-stock company, joint-stock limited liability company, knowledge worker, Leonard Kleinrock, Long Term Capital Management, mail merge, Marc Andreessen, margin call, market bubble, Maui Hawaii, Menlo Park, Metcalfe’s law, Mitch Kapor, Network effects, packet switching, pattern recognition, pets.com, railway mania, risk tolerance, Robert Metcalfe, Sand Hill Road, Silicon Valley, South China Sea, spinning jenny, Steve Jobs, Steve Wozniak, Toyota Production System, zero-sum game
I once asked Gordon Moore about the whole Microma experience. He quickly pulled up his sleeve, pointed to a Microma watch on his wrist and told me he wore it often to remind himself never to be that stupid again. Intel’s lesson: make the intellectual property, not the end product. The cool thing about a computer on a chip is that you can start a computer company without knowing much about computers. Steve Jobs and Steve Wozniak created Apple Computer without knowing that much. Wozniak had to write some software to get data on and off a floppy disk drive, which no one else had, and their Apple II became a hit. IBM knew lots about how to milk big bucks out of big computers, but nothing about microprocessors. So a stealth group in Florida contracted out the work, creating a Frankenstein-like IBM PC in 1981, using an Intel microprocessor, Microsoft software and a Western Digital disk controller.
It was lunchtime at George Gilder’s Telecosm conference, and we were waiting for the featured speaker, Gary Winnick of Global Crossing, to explain how he sends billions of packets per second under the Atlantic Ocean. George Gilder has hosted his Telecosm conference for years. Tech luminaries like Carver Mead, Bob Metcalfe and Paul Allen were regulars. “I don’t know what the first packet was,” I confessed. My tablemate turned out to be Leonard Kleinrock, a UCLA professor, according to his name tag. It turned out that he had been at the creation. Since the 1977 introduction of the Apple II computer and the 1981 announcement of the IBM PC, the world has been flooded with smaller, cheaper and faster computers. More than 100 million new ones get sold every year. But today, these are no islands—the power of these computers is in their ability to communicate. The telephone network, which is optimized for your talks with Mom, was the medium for computer communications.
See Advanced Micro Devices American Federation of Information Processing Societies, Fall Joint Computer Conference (1968), 119–20, 123 America Online. See AOL Andreessen, Marc, 197, 199 animation, 134–35 AOL (America Online), 69–73, 207, 208, 223, 290 Cisco routers and, 199 Inktomi cache software and, 143 Netscape Navigator purchase, 201, 225 Telesave deal, 72–73 TimeWarner deal, 223, 229 as top market cap company, 111 Apache Web server, 247 Apple Computer, 45, 127, 128 Apple II, 183 Applied Materials, 245 Archimedes (propeller ship), 94 Arkwright, Richard, 65 ARPANET, 186, 187, 189, 191 Arthur Andersen, 290 Artists and Repertoire (A&R), 212, 216 Asian debt crisis, 3, 150, 151, 229, 260 yen and, 162–65, 168, 292 @ (at sign), 187 AT&T, 61, 185–86, 189 August Capital, 2, 4 auto industry, 267–68 Aziz, Tariq, 26 Babbage, Charles, 93 Baker, James, 26 Balkanski, Alex, 44, 249 bandwidth, 60, 111, 121, 140, 180, 188–89 Baran, Paul, 184, 185 Barbados, 251, 254 Barksdale, Jim, 198, 199–201 Barksdale Group, 201 BASE, 249 BASIC computer language, 126, 127 BBN.
The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop
Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, Donald Davies, double helix, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Leonard Kleinrock, Marc Andreessen, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Robert Metcalfe, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process, zero-sum game
On the hardware side, this challenge was taken up most famously by the Apple Computer Company, founded in 1976 by Homebrew Computer Club members Steve Wozniak and Steve Jobs, longtime buddies from the Silicon Valley town of Cupertino. After some encouraging success with their first computer, which they marketed through local hobby shops (it was actually just a single circuit board using the new, 8-bit 6502 microprocessor from MOS Technology, plus 4 kilobytes of RAM), Jobs and Wozniak were joined by the thirty-four-year-old A. C. Markkula, formerly the marketing manager for Intel. Markkula, who had retired from that company two years earlier after earning more than a million dollars in stock options, bought a one-third partnership in Apple for $91,000 and began working his contacts to bring in venture capital and management expertise. The result was the Apple II, a much-upgraded, 6502-based micro that was introduced in April 1977 at the first West Coast Computer Faire in San Francisco.
What could a sclerotic bureaucracy like Xerox do that his own team couldn't do better and faster? Partly, this reaction was a function of his countercultural disdain for large corporations in general. (The scruffily bearded Jobs wore his T-shirts, jeans, and sandals like a badge of honor.) And partly it was attributable to his memory of Steve Wozniak's former employer, Hewlett-Packard, which had once rebuffed Woz's proposal for a microcomputer. But in any case, Jobs changed his mind after repeated urging by Apple engineer Jef Raskin, who had joined the company to help design the Apple II. Raskin had visited PARC, as it happened, and his friends there had shown him its wonders. So on April 2, 1979, Jobs and his team met with the XDC people and struck a deal that could make sense only in the go-go world of Silicon Valley: Xerox would be allowed to invest $1.05 million in Apple's private stock sale, and in return it would allow Apple full access to PARC's technology.
And perhaps most important of all, it was great for playing video games; Wozniak, the technical wizard of the team and a video-game addict himself, had designed it with precisely that use in mind. Of course, the Apple II had plenty of competition in the consumer market, notably from the Commodore PET, which debuted at the same West Coast Computer Faire, and from the Tandy-Radio Shack TRS-80, which was introduced the following August. In the beginning, moreover, the promises made in the Regis McKenna ad copy ("You'll be able to organize, index and store data on household finances, income taxes, recipes, your biorhythms, balance your checking account, even control your home environment") were little more than fantasy; the applications software that would work such magic didn't exist yet. Nonetheless, the Apple II was an instant hit. By decade's end Apple itself had become one of the fastest-growing companies in American history.
The Entrepreneurial State: Debunking Public vs. Private Sector Myths by Mariana Mazzucato
"Robert Solow", Apple II, banking crisis, barriers to entry, Bretton Woods, business cycle, California gold rush, call centre, carbon footprint, Carmen Reinhart, cleantech, computer age, creative destruction, credit crunch, David Ricardo: comparative advantage, demand response, deskilling, endogenous growth, energy security, energy transition, eurozone crisis, everywhere but in the productivity statistics, Financial Instability Hypothesis, full employment, G4S, Growth in a Time of Debt, Hyman Minsky, incomplete markets, information retrieval, intangible asset, invisible hand, Joseph Schumpeter, Kenneth Rogoff, Kickstarter, knowledge economy, knowledge worker, natural language processing, new economy, offshore financial centre, Philip Mirowski, popular electronics, profit maximization, Ralph Nader, renewable energy credits, rent-seeking, ride hailing / ride sharing, risk tolerance, shareholder value, Silicon Valley, Silicon Valley ideology, smart grid, Steve Jobs, Steve Wozniak, The Wealth of Nations by Adam Smith, Tim Cook: Apple, too big to fail, total factor productivity, trickle-down economics, Washington Consensus, William Shockley: the traitorous eight
While the products owe their beautiful design and slick integration to the genius of Jobs and his large team, nearly every state-of-the-art technology found in the iPod, iPhone and iPad is an often overlooked and ignored achievement of the research efforts and funding support of the government and military. Only about a decade ago Apple was best known for its innovative personal computer design and production. Established on 1 April 1976 in Cupertino, California by Steve Jobs, Steve Wozniak and Ronald Wayne, Apple was incorporated in 1977 by Jobs and Wozniak to sell the Apple I personal computer.1 The company was originally named Apple Computer, Inc. and for 30 years focused on the production of personal computers. On 9 January 2007, the company announced it was removing the ‘Computer’ from its name, reflecting its shift in focus from personal computers to consumer electronics.
In effect, the government and business community underestimated the challenge at hand, though critics tend to focus on the failure of government and not of finance; and that (3) failure is hard to judge unless we have proper metrics to be able to understand the spillover effects that investments have, even when there is no final product. These international projects did establish networks of learning between utilities, government R&D, the business community and universities. 3 Moreover, as discussed in Chapter 5, the Apple II, which ran Kenetech’s first projects, would also not have been possible without government investments. 4 Hoffman had acquired the original Bell Labs patent through acquisition of National Fabricated Products in 1956. 5 Details on Suntech are based on a forthcoming piece of work by Matt Hopkins and Yin Li, ‘The Rise of the Chinese Solar Photovoltaic Industry and its Impact on Competition and Innovation’.
From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry by Martin Campbell-Kelly
Apple II, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, business process, card file, computer age, computer vision, continuous integration, deskilling, Donald Knuth, Grace Hopper, information asymmetry, inventory management, John Markoff, John von Neumann, linear programming, longitudinal study, Menlo Park, Mitch Kapor, Network effects, popular electronics, RAND corporation, Robert X Cringely, Ronald Reagan, Silicon Valley, software patent, Steve Jobs, Steve Wozniak, Steven Levy, Thomas Kuhn: the structure of scientific revolutions
The transforming event for the personal computer was the launch of the Apple II in April 1977. The tiny firm of Apple Computer had been formed by the computer hobbyists Steve Jobs and Steve Wozniak in 1976. Their first machine, the Apple, was a raw computer board designed for kit-building hobbyists. The Apple II, however, was an unprecedented leap of imagination and packaging. Looking much like the computer terminals seen on airport reservation desks, it consisted of a keyboard, a CRT display screen, and a central processing unit, all in one package. Though Jobs was not alone in having such a vision of the personal computer, he was by far the most successful at orchestrating the technological and manufacturing resources to bring it to fruition. During 1977, the Apple II was joined by many imitators from entrepreneurial startups, and by machines from two major electronics manufacturers: the Commodore PET and the TRS-80.
The PFS:File database system was intended to occupy a perceived gap in the market for a mid-price database that would also exploit John Page’s background as the designer of the database software for the HP 3000 minicomputer. The package was developed for the Apple II, with ease of use rather than technical sophistication as its prime selling point. It was priced at $140, barely one-fifth the price of dBase II. Produced and published in house, the package was distributed directly to computer stores. All this was achieved in 1980, when the founders were still full-time employees of Hewlett-Packard. In early 1981, with the success of PFS:File, they secured venture funding and incorporated as Software Publishing. By the fall of 1983, they had sold a quarter-million copies of PFS:File for Apple II, TRS-80, and IBM-compatible machines. Software Publishing continued to exploit the market niche for mid-price software with follow-on products such as PFS:Write and PFS:Graph.
In the boulders-pebbles-sand model of the software industry, the microcomputer-industry-specific application vendors were some of the finer grains of sand. Consumer Software When the Apple II was launched in 1977, it was positioned as a “home/personal” computer. The advertising copy reflected that: The home computer that’s ready to work, play and grow with you. . . . You’ll be able to organize, index and store data on household finances, income taxes, recipes, your biorhythms, balance your checking account, even control your home environment.41 In fact, hardly any of those applications were achievable; the software did not exist. In 1977, the Apple II was too limited and too expensive for home use other than by the most dedicated enthusiast, so it was sold primarily to schools and businesses. However, during the period 1979–1981 many low-cost machines designed expressly for the domestic market were offered by Atari, Coleco, Commodore, Tandy, Texas Instruments, Timex, and Sinclair.
Augmented: Life in the Smart Lane by Brett King
23andMe, 3D printing, additive manufacturing, Affordable Care Act / Obamacare, agricultural Revolution, Airbnb, Albert Einstein, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, Apple II, artificial general intelligence, asset allocation, augmented reality, autonomous vehicles, barriers to entry, bitcoin, blockchain, business intelligence, business process, call centre, chief data officer, Chris Urmson, Clayton Christensen, clean water, congestion charging, crowdsourcing, cryptocurrency, deskilling, different worldview, disruptive innovation, distributed generation, distributed ledger, double helix, drone strike, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, fiat currency, financial exclusion, Flash crash, Flynn Effect, future of work, gig economy, Google Glasses, Google X / Alphabet X, Hans Lippershey, Hyperloop, income inequality, industrial robot, information asymmetry, Internet of things, invention of movable type, invention of the printing press, invention of the telephone, invention of the wheel, James Dyson, Jeff Bezos, job automation, job-hopping, John Markoff, John von Neumann, Kevin Kelly, Kickstarter, Kodak vs Instagram, Leonard Kleinrock, lifelogging, low earth orbit, low skilled workers, Lyft, M-Pesa, Mark Zuckerberg, Marshall McLuhan, megacity, Metcalfe’s law, Minecraft, mobile money, money market fund, more computing power than Apollo, Network effects, new economy, obamacare, Occupy movement, Oculus Rift, off grid, packet switching, pattern recognition, peer-to-peer, Ray Kurzweil, RFID, ride hailing / ride sharing, Robert Metcalfe, Satoshi Nakamoto, Second Machine Age, selective serotonin reuptake inhibitor (SSRI), self-driving car, sharing economy, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart grid, smart transportation, Snapchat, social graph, software as a service, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steve 
Wozniak, strong AI, TaskRabbit, technological singularity, telemarketer, telepresence, telepresence robot, Tesla Model S, The Future of Employment, Tim Cook: Apple, trade route, Travis Kalanick, Turing complete, Turing test, uber lyft, undersea cable, urban sprawl, V2 rocket, Watson beat the top human players on Jeopardy!, white picket fence, WikiLeaks
Leonard Kleinrock, UCLA, from an interview on the first ARPANET packet-switching test in 1969. In parallel to the development of early computer networks, various computer manufacturers set about shrinking and personalising computer technology so that it could be used at home or in the office. Contrary to popular belief, IBM wasn’t the first company to create a personal computer (PC). In the early 1970s, Steve Jobs and Steve Wozniak had been busy working on their own version of the personal computer. The result—the first Apple computer (retrospectively known as the Apple I)—actually preceded the IBM model by almost five years, and used a very different engineering approach. However, it wasn’t until Apple launched the Apple II that personal computing really became a “thing”.
Figure 3.2: An original Apple I computer designed by Jobs and Wozniak and released in 1976 (Credit: Bonhams New York)
Around the same time as Jobs and Wozniak’s development of the earliest form of PC, there was also a rapid downsizing of computers in the workplace.
I still vividly remember the time when I hacked into the school administrator system to find the teachers’ records; I got a two-week time out from the computer room for that one. I would offer to code other kids’ assignments for them for a nominal fee. It wasn’t about the money, it was simply a test to see if I could get the same results or output using different versions of the program. Around this time, my pal Dan Goldberg introduced me to my first Apple II computer, and not long after that I got my first Vic-20 microcomputer at home. A few years later, I convinced my father to invest in an IBM compatible computer for the home. I’d gone from punching in programs on paper cards that read graphite pencil marks to keyboards and monochrome screens. Interfaces, especially when it came to games or graphics, were extremely primitive. The Commodore Vic-20 microcomputer that I owned had about 4 KB of built-in RAM, a 16 KB expansion pack, and a cassette tape deck for storing programs.
Rise of the Machines: A Cybernetic History by Thomas Rid
1960s counterculture, A Declaration of the Independence of Cyberspace, agricultural Revolution, Albert Einstein, Alistair Cooke, Apple II, Apple's 1984 Super Bowl advert, back-to-the-land, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, business intelligence, Charles Lindbergh, Claude Shannon: information theory, conceptual framework, connected car, domain-specific language, Douglas Engelbart, dumpster diving, Extropian, full employment, game design, global village, Haight Ashbury, Howard Rheingold, Jaron Lanier, job automation, John Markoff, John von Neumann, Kevin Kelly, Kubernetes, Marshall McLuhan, Menlo Park, Mitch Kapor, Mother of all demos, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shale / tar sands, pattern recognition, RAND corporation, Silicon Valley, Simon Singh, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Telecommunications Act of 1996, telepresence, The Hackers Conference, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, Y2K, Yom Kippur War, Zimmermann PGP
They invented time-sharing, against the interest of large corporations, and gave more people access to SAGE-style supercomputers—in effect, turning mainframes into more widely accessible virtual personal computers. The second wave of hackers, in the late 1970s, overturned mainframes entirely by bringing the personal computer to market. Many of them were hard-core counterculture types—for instance, Steve Jobs and Steve Wozniak, two cofounders of Apple. They had honed their skills by developing, and then selling, so-called blue boxes, illegal phone phreaking devices to make free calls. Then came the third wave of “hackers,” the social hackers of the early 1980s. The personal computer and emerging network technology didn’t articulate an entire philosophy and aesthetic just by themselves. Of course, building software tools to connect and educate communities helped, and the then emerging free-software movement offered a promising platform.
For Christmas that year, Gibson finally bought an Apple II at a discount. The machine’s successor model, the Macintosh, had been launched so effectively nearly one year earlier with the legendary cyberpunk ad “1984,” but the older Apple II was still a best-selling device. When Gibson booted up the machine at home and got ready to use it, he was shocked by the computer’s mundane mechanical makeup. “Here I’d been expecting some exotic crystalline thing, a cyberspace deck or something, and what I’d gotten was something with this tiny piece of a Victorian engine in it, like an old record player.”46 The science fiction writer called up the store to complain. What was making this noise? The operator told him it was normal; the hard drive was simply spinning in the box that was the Apple II. Gibson’s ignorance about computers, he recounted, had allowed him to romanticize technology.
“Welcome to World War III, the Cybernetic War created by machines for machines.”1 The arsenals of cybernetic war were stocked with an array of fancy weaponry: cruise missiles, smart bombs, sophisticated intercontinental missiles with multiple warheads, and tools such as robotic pattern recognition, code, game theory, cryptography, and simulation. In 1979 these terms were still somewhat vague and undefined, all sounding equally futuristic. When Post’s article on the future of war came out, its analysis was at the cutting edge of technology. The US military was still reeling from defeat in Vietnam, a decidedly low-tech war. The Apple II had been released less than two years earlier, in June 1977. “What kind of man owns his own computer?” asked an Apple advertisement in Omni just after Post’s article: “Rather revolutionary, the idea of owning your own computer.”2 E-mail didn’t exist yet. Usenet, one of the world’s first computer network communication systems, had not been set up yet. CompuServe began offering a dial-up online information service to customers only four months later, in September 1979.3 Predicting ubiquitous networked computers the size of a book was a daring move.
This Machine Kills Secrets: Julian Assange, the Cypherpunks, and Their Fight to Empower Whistleblowers by Andy Greenberg
Apple II, Ayatollah Khomeini, Berlin Wall, Bill Gates: Altair 8800, Burning Man, Chelsea Manning, computerized markets, crowdsourcing, cryptocurrency, domain-specific language, drone strike, en.wikipedia.org, fault tolerance, hive mind, Jacob Appelbaum, Julian Assange, Mahatma Gandhi, Mitch Kapor, MITM: man-in-the-middle, Mohammed Bouazizi, nuclear winter, offshore financial centre, pattern recognition, profit motive, Ralph Nader, Richard Stallman, Robert Hanssen: Double agent, Silicon Valley, Silicon Valley ideology, Skype, social graph, statistical model, stem cell, Steve Jobs, Steve Wozniak, Steven Levy, undersea cable, Vernor Vinge, We are Anonymous. We are Legion, We are the 99%, WikiLeaks, X Prize, Zimmermann PGP
Those early PCs had to be assembled from kits, and learning to use them was often inextricable from learning to code. So a kindergarten-age Zatko acquired the ability to write software as naturally as most children learn to write their ABCs. At the same time, his parents introduced him to the violin and later the guitar; his talents on both sets of instruments, digital and analog, developed in parallel. When the Apple II was released, Zatko’s grandfather spent Zatko’s father’s entire inheritance to buy the sleek new machine for the family’s prodigy. Plugging into Steve Jobs and Steve Wozniak’s powerful creation, Zatko soon discovered video games, their annoying copyright protections, and the tantalizing task of picking those digital locks. “It’s 1978, I’m eight years old, and twenty dollars for a game is a lot of money,” says Zatko. “I can’t even make a backup copy. So I had to hack the systems, reverse engineer and disassemble them.
But Zatko’s friends from a decade later tell a somewhat different story, one of a young hacker who saw network defenses as speed bumps, and crossed enough of them to run afoul of the feds before the age of eighteen. In 1999, he told The New York Times Magazine that he had once received an informal warning from a “three letter agency.” Zatko claims he never knowingly broke computer laws as a teenager; he has no criminal record. The only souvenir that remains of his adventures on the edge of the law would be a long-confiscated Apple II PC that his colleagues say appeared in his office many years later, a well-preserved time capsule from a more anarchic period of Zatko’s and the Internet’s life. Fortunately for the young hacker, he also possessed less controversial talents. In 1988, Zatko was accepted to the Berklee College of Music in Boston and spent the next four years honing his guitar skills and composing music. After graduating at the top of his class, Mudge started work at a Boston computer graphics firm, joined a progressive rock band, and began attending a meet-up of hacker types the first Friday of every month at the Au Bon Pain across from Harvard Square’s chess tables.
The Virtual Community: Homesteading on the Electronic Frontier by Howard Rheingold (Perseus Books, 1993)
Apple II, Brewster Kahle, Buckminster Fuller, commoditize, conceptual framework, Douglas Engelbart, Electric Kool-Aid Acid Test, experimental subject, George Gilder, global village, Hacker Ethic, Haight Ashbury, Howard Rheingold, HyperCard, John Markoff, Kevin Kelly, knowledge worker, license plate recognition, loose coupling, Marshall McLuhan, Menlo Park, meta analysis, meta-analysis, Mitch Kapor, packet switching, Panopticon Jeremy Bentham, profit motive, RAND corporation, Ray Oldenburg, rent control, RFC: Request For Comment, Ronald Reagan, Saturday Night Live, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Ted Nelson, telepresence, The Great Good Place, The Hackers Conference, urban decay, Whole Earth Catalog, Whole Earth Review, young professional
They founded the Electronic Frontier Foundation (EFF) that afternoon in Barlow's Pinedale kitchen. Within a few days, Kapor had put Barlow in touch with the distinguished constitutional law firm that had made it possible for the New York Times to publish the Pentagon Papers. Kapor, concerned about the nature of the Sun Devil arrests and what they signaled for civil liberties in cyberspace, offered to support the costs of legal defense. Acid Phreak, Phiber Optik, and their buddy Scorpion were represented by Rabinowitz, Boudin, Standard, Krinsky, and Lieberman. Within days of the Pinedale meeting, Steve Wozniak, cofounder of Apple Computer, and John Gilmore, Unix telecommunications wizard and one of the first employees of the enormously successful Sun Microsystems, offered to match Kapor's initial contributions. A board of directors was recruited that included, among others, WELL founder Stewart Brand. The EFF endowment was intended from the beginning to be a great deal more than a defense fund. The EFF founders saw, as the first reporters from the mass media did not, that Sun Devil was not just a hacker bust.
In 1981, he met an American woman, Gena, who later became his wife. Through Gena he became involved with Steve Plummer, the dean of students at the American University of Paris. Plummer and another partner wanted to give personal computer users remote access to advanced programming languages. At that time, most PC users in France used the Apple II, a laughably puny antique by today's standards, with modems that transmitted information at 300 bits per second. But they had high hopes that the hardware would become more powerful in the future and they would have a growing business. Plummer and his partner needed a technically knowledgeable Parisian. Lumbroso came along, took a crash course in computers, and started to create an online system from scratch, in time for a big computer show.
You could see the transcript on your screen. We would do it for hours on end." The word began to get around the French Apple community, and by 1985, Calvados had grown to about three thousand users, and income was about $100,000 a month. In 1986, Steve Plummer found $2 million to finance the system outside the American University. They bought more powerful hardware, rewrote the software, and targeted the service not just to Apple II users but to all PC users. They designed the new version of their online service, which they renamed CalvaCom, using the metaphor of different "cities" to represent different forums. The Macintosh city, The PC city, and the Atari city were the major discussion areas. Like the first BBSs in the United States, CalvaCom was still frequented mostly by computer enthusiasts or people in the computer business, talking mostly about computing.
The Best of 2600: A Hacker Odyssey by Emmanuel Goldstein
affirmative action, Apple II, call centre, don't be evil, Firefox, game design, Hacker Ethic, hiring and firing, information retrieval, John Markoff, late fees, license plate recognition, Mitch Kapor, MITM: man-in-the-middle, optical character recognition, packet switching, pirate software, place-making, profit motive, QWERTY keyboard, RFID, Robert Hanssen: Double agent, rolodex, Ronald Reagan, Silicon Valley, Skype, spectrum auction, statistical model, Steve Jobs, Steve Wozniak, Steven Levy, Telecommunications Act of 1996, telemarketer, undersea cable, Y2K
I am a modern hacker, but I’ve been interested in computers since I was a child in the early 1970s when “hack” meant “create” and not the current media corruption, which essentially translates to “destroy.” This was a time when there were no visible computers and the government still decided who had ARPANET access. Around then, the first ads started appearing for Steve Jobs’ and Steve Wozniak’s Apple II—a useful configuration cost the same as taking a family to Europe (or the United States if you’re European). A real physical computer like the ones I saw in the magazines that taught me to program was simply out of the question. My only computer was imaginary. It existed only as a simulation in my head and in my notebook—the old fashioned paper kind. My computer programs were just lists of commands and parameters on paper, much like those programs of the first hacker Alan Turing, who hand-simulated the world’s first chess program in the 1940s before the computers he fathered existed.
A good part of this issue is devoted to those matters and, as a result, many articles we were planning on running were bumped to the autumn issue. It would be nice if there was substantially less of this to report for our next issue. What is the EFF? (Summer, 1990) One of the results of our public outcry over the hacker raids this spring has been the formation of the Electronic Frontier Foundation (EFF). Founded by computer industry giants Mitch Kapor and Steve Wozniak along with writer John Barlow, the EFF sought to put an end to raids on publishers, bulletin board operators, and all of the others that have been caught up in recent events. The EFF founders, prior to the organization’s actual birth this summer, had said they would provide financial support to those affected by unjust Secret Service raids.
I don’t envy Kevin Mitnick for the ordeal he’s endured with the government. I think of myself as lucky to have never spent a day in jail. If I had, I don’t think I would have emerged a survivor. Quite honestly, I probably wouldn’t be here today. I don’t think this mark on my record, this felony, reflects with much accuracy what kind of person I am, or what kind of employee I am. Many youths do stupid things that aren’t necessarily injurious to anyone. Before Steve Wozniak and Steve Jobs co-founded Apple Computer, they “cheated” the phone company with a device called a “blue box” while in college at Berkeley, CA. Didn’t they turn into quasi-responsible multimillionaires? “They didn’t get caught,” a landlord said to me, whose rental operation routinely turned away convicted felons per police sponsored programs.
Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing (Writing Science) by Thierry Bardini
Apple II, augmented reality, Bill Duvall, conceptual framework, Donald Davies, Douglas Engelbart, Dynabook, experimental subject, Grace Hopper, hiring and firing, hypertext link, index card, information retrieval, invention of hypertext, Jaron Lanier, Jeff Rulifson, John von Neumann, knowledge worker, Leonard Kleinrock, Menlo Park, Mother of all demos, new economy, Norbert Wiener, Norman Mailer, packet switching, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, RFC: Request For Comment, Sapir-Whorf hypothesis, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, stochastic process, Ted Nelson, the medium is the message, theory of mind, Turing test, unbiased observer, Vannevar Bush, Whole Earth Catalog
(Johnson et al. 1989, 25-26) The "other" personal computer revolution that redefined the idea of the personal computer was indeed partly the result of the computing philosophy that had led to the Star and that had invented a personal user for the computer. The final stages of product development and marketing of the interface for the personal computer, however, occurred at Apple, not at Xerox. Apple and The End of the Bootstrapping Process The fairy-tale story of the founding of Apple Computer by Steve Jobs and Steve Wozniak, beginning with the Apple I and the meetings of computer hobbyists at the Homebrew Computer Club in a Palo Alto burger joint, is often told and need not be repeated here. It is necessary, however, to trace the path followed by Douglas Engelbart's innovations as they reached their terminus in the form in which they now are employed, a form very different from the one Engelbart had envisioned for them.
I thank all the editors of these journals for granting me the permission to reprint and update parts of these publications.
[Figure: Developments in Computer Technology, 1943-1964]
[Figure: Developments in Computer Technology, 1969-1984 (computers are shown above the line; software and components, below)]
INTRODUCTION Douglas Engelbart's Crusade for the Augmentation of Human Intellect Journal entry 37.
Jobs foresaw the potential of such a technology for a marketable product. Two major factors influenced the success of the technology transfer of the graphic user interface from Xerox PARC to Apple. Jobs and Wozniak were connected to the hobbyist movement of the early 1970s, and by 1979, Apple had successfully moved from this hobbyist market to the office market, thanks to VisiCalc, the first spreadsheet program, developed for the Apple II. "The two Steves - Jobs and Wozniak - they understood their market. The way they understood their market was twofold: 1) they were it, and that's the best way to understand a market, and 2) they just liked to go to the Homebrew Computer Club and having the neatest thing. . . The neatest thing available to that community" (Belleville 1992). The first factor helped Jobs to realize a potential market existed for an individually owned personal computer.
Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots by John Markoff
"Robert Solow", A Declaration of the Independence of Cyberspace, AI winter, airport security, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, basic income, Baxter: Rethink Robotics, Bill Duvall, bioinformatics, Brewster Kahle, Burning Man, call centre, cellular automata, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, collective bargaining, computer age, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deskilling, don't be evil, Douglas Engelbart, Douglas Engelbart, Douglas Hofstadter, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, factory automation, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, Google Glasses, Google X / Alphabet X, Grace Hopper, Gunnar Myrdal, Gödel, Escher, Bach, Hacker Ethic, haute couture, hive mind, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, Mitch Kapor, Mother of all demos, natural language processing, new economy, Norbert Wiener, PageRank, pattern recognition, pre–internet, RAND corporation, Ray Kurzweil, Richard Stallman, Robert Gordon, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, 
Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Nelson, telemarketer, telepresence, telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Turing test, Vannevar Bush, Vernor Vinge, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, William Shockley: the traitorous eight, zero-sum game
The Stanford Artificial Intelligence Laboratory quickly became a California haven for the same hacker sensibility that had spawned at MIT. Smart young computer hackers like Steve “Slug” Russell and Whitfield Diffie followed McCarthy west, and during the next decade and a half a startling array of hardware engineers and software designers would flow through the laboratory, which maintained its countercultural vibe even as McCarthy became politically more conservative. Both Steve Jobs and Steve Wozniak would hold on to sentimental memories of their visits as teenagers to the Stanford laboratory in the hills. SAIL would become a prism through which a stunning group of young technologists as well as full-blown industries would emerge. Early work in machine vision and robotics began at SAIL, and the laboratory was indisputably the birthplace of speech recognition. McCarthy gave Raj Reddy his thesis topic on speech understanding, and Reddy went on to become the seminal researcher in the field.
The idea was that unskilled users would be able to retrieve information by posing queries in normal sentences. There was no money, only a promise of stock if the project took off. Kaplan’s expertise was in natural-language front ends that would allow typed questions to an expert system. What Hendrix needed, however, was a simple database back end for his demonstration. And so over a Christmas holiday at the end of 1980, Kaplan sat down and programmed one. The entire thing initially ran on an Apple II. He did it on a contingent basis, and in fact he didn’t get rich. The first Symantec never went anywhere commercially, and the venture capitalists did a “cram down,” a financial maneuver in which company founders often see their equity lose value in exchange for new investments. As a result, what little stock Kaplan owned was now worthless. In the end he left Stanford and joined Teknowledge because he admired Lee Hecht, the University of Chicago physicist and business school professor who had been brought in to be CEO and provide adult supervision for the twenty Stanford AI refugees who were the Teknowledge shock troops.
Coastal California Travel Guide by Lonely Planet
1960s counterculture, Airbnb, airport security, Albert Einstein, anti-communist, Apple II, Asilomar, back-to-the-land, Bay Area Rapid Transit, Burning Man, buy and hold, California gold rush, call centre, car-free, carbon footprint, Donner party, East Village, El Camino Real, Electric Kool-Aid Acid Test, flex fuel, Frank Gehry, glass ceiling, Golden Gate Park, Haight Ashbury, haute couture, haute cuisine, income inequality, intermodal, Joan Didion, Kickstarter, Loma Prieta earthquake, low cost airline, Lyft, Mason jar, New Journalism, ride hailing / ride sharing, Ronald Reagan, Rosa Parks, Saturday Night Live, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, starchitect, stealth mode startup, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, trade route, transcontinental railway, uber lyft, Upton Sinclair, upwardly mobile, urban sprawl, Wall-E, white picket fence, Whole Earth Catalog, women in the workforce, working poor, Works Progress Administration, young professional, Zipcar
When Hewlett-Packard introduced the first personal computer in 1968, advertisements breathlessly gushed that the ‘light’ (40lb) machine could ‘take on roots of a fifth-degree polynomial, Bessel functions, elliptic integrals and regression analysis’ – all for just $4900 (almost $35,000 today). Consumers didn’t know quite what to do with computers, but in his 1969 Whole Earth Catalog, author (and former LSD tester for the CIA) Stewart Brand explained that the technology governments used to run countries could empower ordinary people. Hoping to bring computer power to the people, 21-year-old Steve Jobs and Steve Wozniak introduced the Apple II at the 1977 West Coast Computer Faire. Still, the question remained: what would ordinary people do with all that computing power? By the mid-1990s an entire dot-com industry boomed in Silicon Valley with online start-ups, and suddenly people were getting everything – their mail, news, politics, pet food and, yes, sex – online. But when dot-com profits weren’t forthcoming, venture-capital funding dried up and fortunes in stock options disappeared when the dot-com bubble burst and the Nasdaq Stock Market plummeted on March 10, 2000.
SAN JOSE FOR CHILDREN
Children’s Discovery Museum MUSEUM (tel 408-298-5437; www.cdm.org; 180 Woz Way; $13, child under 1yr free; 10am-5pm Tue-Sat, noon-5pm Sun) Downtown, this science and creativity museum has hands-on displays incorporating art, technology and the environment, with plenty of toys and cool play-and-learn areas for tots to school-aged children. The museum is on Woz Way, named after Steve Wozniak, cofounder of Apple.
California's Great America AMUSEMENT PARK (tel 408-988-1776; www.cagreatamerica.com; 4701 Great America Pkwy, Santa Clara; adult/child under 4ft $69/48; Apr-Oct, hours vary) If you can handle the shameless product placements, kids love the roller coasters and other thrill rides. Online tickets cost much less than walk-up prices. Parking is $15 to $25, but the park is also accessible by public transportation.
Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia by Anthony M. Townsend
1960s counterculture, 4chan, A Pattern Language, Airbnb, Amazon Web Services, anti-communist, Apple II, Bay Area Rapid Transit, Burning Man, business process, call centre, carbon footprint, charter city, chief data officer, clean water, cleantech, cloud computing, computer age, congestion charging, connected car, crack epidemic, crowdsourcing, DARPA: Urban Challenge, data acquisition, Deng Xiaoping, digital map, Donald Davies, East Village, Edward Glaeser, game design, garden city movement, Geoffrey West, Santa Fe Institute, George Gilder, ghettoisation, global supply chain, Grace Hopper, Haight Ashbury, Hedy Lamarr / George Antheil, hive mind, Howard Rheingold, interchangeable parts, Internet Archive, Internet of things, Jacquard loom, Jane Jacobs, jitney, John Snow's cholera map, Joi Ito, Khan Academy, Kibera, Kickstarter, knowledge worker, load shedding, M-Pesa, Mark Zuckerberg, megacity, mobile money, mutually assured destruction, new economy, New Urbanism, Norbert Wiener, Occupy movement, off grid, openstreetmap, packet switching, Panopticon Jeremy Bentham, Parag Khanna, patent troll, Pearl River Delta, place-making, planetary scale, popular electronics, RFC: Request For Comment, RFID, ride hailing / ride sharing, Robert Gordon, self-driving car, sharing economy, Silicon Valley, Skype, smart cities, smart grid, smart meter, social graph, social software, social web, special economic zone, Steve Jobs, Steve Wozniak, Stuxnet, supply-chain management, technoutopianism, Ted Kaczynski, telepresence, The Death and Life of Great American Cities, too big to fail, trade route, Tyler Cowen: Great Stagnation, undersea cable, Upton Sinclair, uranium enrichment, urban decay, urban planning, urban renewal, Vannevar Bush, working poor, working-age population, X Prize, Y2K, zero day, Zipcar
The Altair used the same Intel 8080 microprocessor and sold as a kit for less than $400. But you had to put the thing together yourself. Hobbyists quickly formed groups like Silicon Valley’s Homebrew Computer Club to trade tips, hacks, and parts for these DIY computers. Homebrew was a training camp for innovators like Apple cofounders Steve Jobs and Steve Wozniak, who would overthrow IBM’s dominance of the computer industry. (According to Wozniak, the Apple I and Apple II were demo’d at Homebrew meetings repeatedly during their development.) Never before had so much computing power been put in the hands of so many. Grassroots smart-city technologies—mobile apps, community wireless networks, and open-source microcontrollers among them—are following a similar trajectory as the PC: from utopian idea to geek’s plaything to mass market.
San Francisco by Lonely Planet
airport security, Albert Einstein, Apple II, back-to-the-land, banking crisis, Bay Area Rapid Transit, Burning Man, California gold rush, car-free, carbon footprint, centre right, Chuck Templeton: OpenTable:, David Brooks, Electric Kool-Aid Acid Test, G4S, game design, glass ceiling, Golden Gate Park, Haight Ashbury, Joan Didion, Loma Prieta earthquake, Mason jar, New Urbanism, Silicon Valley, South of Market, San Francisco, stealth mode startup, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, transcontinental railway, urban sprawl, Whole Earth Catalog, Zipcar
The message he typed was ‘L,’ then ‘O,’ then ‘G’ – at which point the computer crashed.
Top 5 for Weird Technology: Exploratorium (the Presidio), Musée Mécanique (Fisherman’s Wharf), Audium (Japantown), SFMOMA (SoMa), Children’s Creativity Museum (SoMa)
The next wave of California techies was determined to create a personal computer that could compute and communicate without crashing. When 21-year-old Steve Jobs and Steve Wozniak introduced the Apple II at San Francisco’s West Coast Computer Faire in 1977, techies were abuzz about the memory (4KB of RAM!) and the microprocessor speed (1MHz!). The Apple II originally retailed for the equivalent today of $4300, or, for 48KB of RAM, more than twice that amount – a staggering investment for what seemed like a glorified calculator/typewriter. Even if these computers could talk to one another, pundits reasoned, what would they talk about?
November 20, 1969 Native American activists reclaim the abandoned island of Alcatraz as reparation for broken treaties. The occupation lasts 19 months, until FBI agents forcibly oust the activists.
December 6, 1969 A free concert at Altamont Speedway goes tragically wrong when Hells Angels on bodyguard duty turn on the performers and audience. Four people are killed.
April 16–17, 1977 The Apple II is introduced in SF at the first West Coast Computer Faire, and stuns the crowd with its computing speed (1MHz) – current computers run 2000 to 3000 times faster.
1977 Harvey Milk becomes the first openly gay man elected to US public office. Milk sponsors a gay-rights bill and trend-setting ‘pooper-scooper’ ordinance before his murder by Dan White.
November 18, 1978 After moving his People’s Temple from SF to Guyana, cult leader Jim Jones orders the murders of Congressman Leo Ryan and four journalists and mass suicide of 900 followers.
1981 The first cases of AIDS are identified.
Northern California Travel Guide by Lonely Planet
Airbnb, Apple II, Asilomar, back-to-the-land, Bay Area Rapid Transit, big-box store, Burning Man, buy and hold, California gold rush, call centre, car-free, carbon footprint, clean water, dark matter, Donald Trump, Donner party, East Village, El Camino Real, Electric Kool-Aid Acid Test, Frank Gehry, friendly fire, glass ceiling, Golden Gate Park, Google bus, Haight Ashbury, haute couture, haute cuisine, housing crisis, Joan Didion, Kickstarter, Loma Prieta earthquake, Lyft, Mahatma Gandhi, Mark Zuckerberg, Mason jar, McMansion, means of production, Port of Oakland, ride hailing / ride sharing, Ronald Reagan, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, stealth mode startup, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, the built environment, trade route, transcontinental railway, uber lyft, Upton Sinclair, urban sprawl, white picket fence, Whole Earth Catalog, women in the workforce, working poor, Works Progress Administration, young professional
Geeking Out
When Silicon Valley introduced the first personal computer in 1968, advertisements breathlessly gushed that Hewlett-Packard’s ‘light’ (40lb) machine could ‘take on roots of a fifth-degree polynomial, Bessel functions, elliptic integrals and regression analysis’ – all for just $4900 (nearly $35,000 today). Consumers didn’t know quite what to do with computers, but in his 1969 Whole Earth Catalog, author (and former LSD tester for the CIA) Stewart Brand explained that the technology governments used to run countries could empower ordinary people. Hoping to bring computer power to the people, Steve Jobs and Steve Wozniak, both in their 20s at the time, introduced the Apple II at the 1977 West Coast Computer Faire, with unfathomable memory (4KB of RAM) and microprocessor speed (1MHz). Still, the question remained: what would ordinary people do with all that computing power? By the mid-1990s an entire dot-com industry boomed in Silicon Valley with online start-ups, and suddenly people were getting everything – their mail, news, politics, pet food and, yes, sex – online.
SAN JOSE FOR CHILDREN
Children’s Discovery Museum MUSEUM (tel 408-298-5437; www.cdm.org; 180 Woz Way; $13, child under 1yr free; 10am-5pm Tue-Sat, noon-5pm Sun) Downtown, this science and creativity museum has hands-on displays incorporating art, technology and the environment, with plenty of toys and cool play-and-learn areas for tots to school-aged children. The museum is on Woz Way, named after Steve Wozniak, cofounder of Apple.
California's Great America AMUSEMENT PARK (tel 408-988-1776; www.cagreatamerica.com; 4701 Great America Pkwy, Santa Clara; adult/child under 4ft $69/48; Apr-Oct, hours vary) If you can handle the shameless product placements, kids love the roller coasters and other thrill rides. Online tickets cost much less than walk-up prices. Parking is $15 to $25, but the park is also accessible by public transportation.
Architects of Intelligence by Martin Ford
3D printing, agricultural Revolution, AI winter, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, barriers to entry, basic income, Baxter: Rethink Robotics, Bayesian statistics, bitcoin, business intelligence, business process, call centre, cloud computing, cognitive bias, Colonization of Mars, computer vision, correlation does not imply causation, crowdsourcing, DARPA: Urban Challenge, deskilling, disruptive innovation, Donald Trump, Douglas Hofstadter, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, Fellow of the Royal Society, Flash crash, future of work, gig economy, Google X / Alphabet X, Gödel, Escher, Bach, Hans Rosling, ImageNet competition, income inequality, industrial robot, information retrieval, job automation, John von Neumann, Law of Accelerating Returns, life extension, Loebner Prize, Mark Zuckerberg, Mars Rover, means of production, Mitch Kapor, natural language processing, new economy, optical character recognition, pattern recognition, phenotype, Productivity paradox, Ray Kurzweil, recommendation engine, Robert Gordon, Rodney Brooks, Sam Altman, self-driving car, sensor fusion, sentiment analysis, Silicon Valley, smart cities, social intelligence, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, Ted Kaczynski, The Rise and Fall of American Growth, theory of mind, Thomas Bayes, Travis Kalanick, Turing test, universal basic income, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working-age population, zero-sum game, Zipcar
So, data availability is a big deal and may explain why some areas of AI applications take off much faster in some places than others. But we’ve also got other limitations to deal with, like we still don’t have generalized tools in AI and we still don’t know how to solve general problems in AI. In fact, one of the fun things, and you may have seen this, is that people are now starting to define new forms of what used to be the Turing test. MARTIN FORD: A new Turing Test? How would that work? JAMES MANYIKA: Steve Wozniak, the co-founder of Apple, has actually proposed what he calls the “coffee test” as opposed to Turing tests, which are very narrow in many respects. A coffee test is kind of fun: until you get a system that can enter an average and previously unknown American home and somehow figure out how to make a cup of coffee, we’ve not solved AGI. The reason why that sounds trivial but at the same time quite profound is because you’re solving a large number of unknowable and general problems in order to make that cup of coffee in an unknown home, where you don’t know where things are going to be, what type of coffee maker it is or other tools they have, etc.
I didn’t know the words “artificial intelligence” at the time, but I got very interested in the whole notion of coordinated intelligence from a mathematical, algorithmic, and philosophical perspective. I believed that modeling human intelligence in the machine was possible. There was no reason to think that it wasn’t. MARTIN FORD: Did you follow that with computer science at college? DAVID FERRUCCI: No, I had no idea about careers in computer science or AI, so I went to college and majored in biology to become a medical doctor. During my studies, I got my grandparents to buy me an Apple II computer, and I just started programming everything I could think of. I ended up programming a lot of software for my college, from graphing software for experimental lab work, to ecology simulation software, to analog-to-digital interfacing for lab equipment. This, of course, was before any of this stuff even existed, never mind being able to just download it from the internet. I decided to do as much computer science as I could in my last year of college, so I did a minor in it.
He was an early leader in computer vision and one of the founders of AAAI, the professional organization for AI in America. He also ran an early industry AI lab. Essentially, as a child I lived through the previous big wave of excitement in AI in the late 1970s and 1980s, which allowed me to go to AI conferences as a kid. We grew up in the Bay Area, and one time my father took us to Southern California because there was an Apple AI conference taking place, and this was in the Apple II era. I remember that Apple had bought out Disneyland for the evening for all of the attendees of the big AI conference. So, we flew down for the day just to be able to go on Pirates of the Caribbean 13 times in a row, which, looking back, tells you something about just how big AI was even then. It’s hyped now, but it was the same back then. There were startups, there were big companies, and AI was going to change the world.
California by Sara Benson
airport security, Albert Einstein, Apple II, Asilomar, back-to-the-land, Bay Area Rapid Transit, Berlin Wall, Burning Man, buy and hold, California gold rush, call centre, car-free, carbon footprint, Columbine, dark matter, desegregation, Donald Trump, Donner party, East Village, El Camino Real, Electric Kool-Aid Acid Test, Frank Gehry, global village, Golden Gate Park, Haight Ashbury, haute cuisine, Joan Didion, Khyber Pass, Loma Prieta earthquake, low cost airline, McMansion, means of production, Menlo Park, planetary scale, RFID, ride hailing / ride sharing, Ronald Reagan, Silicon Valley, South of Market, San Francisco, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, the new new thing, trade route, transcontinental railway, Upton Sinclair, urban sprawl, Wall-E, white picket fence, Whole Earth Catalog, working poor, Works Progress Administration, young professional
The message he typed was ‘L’ then ‘O’ then ‘G’ – at which point the computer crashed. * * * To read more about the garage-workshop culture of Silicon Valley go to www.folklore.org, which covers the crashes and personality clashes that made geek history. * * * The next wave of Californian techies was determined to create a personal computer that could compute and communicate without crashing. When 21-year-old Steve Jobs and Steve Wozniak introduced the Apple II at the first West Coast Computer Faire in 1977, techies were abuzz about the memory (4KB of RAM!), the microprocessor speed (1MHz!) and a function straight out of science fiction: the ability to communicate directly with other computers, without a person in between. Not until the 1990s would Silicon Valley engineers finally give computers something to talk about. Start-up ventures rushed to be the first to grab users’ attention along the new ‘information superhighway,’ and suddenly people were getting their mail, news, politics, pet food and, yes, sex online.
The park is on Hwy 35 in the Santa Cruz mountains, just south of the intersection with Hwy 9.
San Jose for Children
Downtown, the technology-focused Children’s Discovery Museum (Map; 408-298-5437; www.cdm.org; 180 Woz Way; admission $8; 10am-5pm Tue-Sat, noon-5pm Sun) has hands-on science and space displays, plenty of toys and some pretty cool play-and-learn areas such as the kooky ‘Alice’s Wonderland.’ The museum is on Woz Way, which is named after Steve Wozniak, the cofounder of Apple. On hot days, kids can cavort among the 22 minigeysers that climb and fall at the play fountain (Map; Plaza de Cesar Chavez) on downtown’s central square. Kids can romp and whoop it up at two nearby theme parks. Raging Waters (408-238-9900; 2333 South White Rd; adult/child under 48in $30/22; May-Sep), a water park inside Lake Cunningham Regional Park, has fast water slides, a tidal pool and a nifty water fort.
California courts voided those marriage contracts, but the civil rights challenges to differential treatment based on sexual orientation aren’t over.
Geek Chic
Industry dwindled steadily in San Francisco after World War II, but the brains of military-industrial operations found work in the so-called Silicon Valley, a technology-centric area that runs south of San Francisco to San Jose. Hewlett-Packard, a company started in a South Bay garage, introduced the forward-thinking 9100A ‘computing genie’ in 1968, and the groundbreaking Apple II was introduced at San Francisco’s West Coast Computer Faire in 1977. By the mid-1990s the tech boom had expanded to include internet sites that sold vegan dog food, art by the yard and extra socks – until venture-capital funding dried up and multimillion-dollar dot-coms shrank into online oblivion. Stock-option paper fortunes disappeared in 2000, leaving service-sector employees and 26-year-old former vice-presidents alike without job prospects.
Free Speech: Ten Principles for a Connected World by Timothy Garton Ash
A Declaration of the Independence of Cyberspace, activist lawyer, Affordable Care Act / Obamacare, Andrew Keen, Apple II, Ayatollah Khomeini, battle of ideas, Berlin Wall, bitcoin, British Empire, Cass Sunstein, Chelsea Manning, citizen journalism, Clapham omnibus, colonial rule, crowdsourcing, David Attenborough, don't be evil, Donald Davies, Douglas Engelbart, Edward Snowden, Etonian, European colonialism, eurozone crisis, failed state, Fall of the Berlin Wall, Ferguson, Missouri, Filter Bubble, financial independence, Firefox, Galaxy Zoo, George Santayana, global village, index card, Internet Archive, invention of movable type, invention of writing, Jaron Lanier, jimmy wales, John Markoff, Julian Assange, Mark Zuckerberg, Marshall McLuhan, mass immigration, megacity, mutually assured destruction, national security letter, Nelson Mandela, Netflix Prize, Nicholas Carr, obamacare, Peace of Westphalia, Peter Thiel, pre–internet, profit motive, RAND corporation, Ray Kurzweil, Ronald Reagan, semantic web, Silicon Valley, Simon Singh, Snapchat, social graph, Stephen Hawking, Steve Jobs, Steve Wozniak, The Death and Life of Great American Cities, The Wisdom of Crowds, Turing test, We are Anonymous. We are Legion, WikiLeaks, World Values Survey, Yom Kippur War
The influence of a Steve Jobs or a Mark Zuckerberg on their respective empires has been more like that of an idiosyncratic absolute ruler in some mediaeval principate than that of the head of government in a modern liberal democracy. Apple’s tethered perfectionism has everything to do with Jobs’s personality. If the other Apple-founding Steve—Wozniak—had become Apple’s dominant figure, it might have remained the open, generative platform it was at the time of the 1982 Apple II desktop computer. For years, Google did not allow advertisements for cigarettes and hard liquor because Sergey Brin and Larry Page disapproved of them. Facebook’s insistence on people using their real names is, to a significant degree, a result of Zuckerberg’s personal attitude. If you have ever doubted the role of the individual in history, just look behind your screen.
Americana: A 400-Year History of American Capitalism by Bhu Srinivasan
activist fund / activist shareholder / activist investor, American ideology, Apple II, Apple's 1984 Super Bowl advert, bank run, barriers to entry, Berlin Wall, blue-collar work, Bob Noyce, Bonfire of the Vanities, British Empire, business cycle, buy and hold, California gold rush, Charles Lindbergh, collective bargaining, commoditize, corporate raider, cuban missile crisis, Deng Xiaoping, diversification, diversified portfolio, Douglas Engelbart, financial innovation, fixed income, Ford paid five dollars a day, global supply chain, Gordon Gekko, Haight Ashbury, hypertext link, income inequality, invisible hand, James Watt: steam engine, Jane Jacobs, Jeff Bezos, John Markoff, joint-stock company, joint-stock limited liability company, Kickstarter, laissez-faire capitalism, Louis Pasteur, Marc Andreessen, Menlo Park, mortgage debt, mutually assured destruction, Norman Mailer, oil rush, peer-to-peer, pets.com, popular electronics, profit motive, race to the bottom, refrigerator car, risk/return, Ronald Reagan, Sand Hill Road, self-driving car, shareholder value, side project, Silicon Valley, Silicon Valley startup, Steve Ballmer, Steve Jobs, Steve Wozniak, strikebreaker, Ted Nelson, The Death and Life of Great American Cities, the new new thing, The Predators' Ball, The Wealth of Nations by Adam Smith, trade route, transcontinental railway, traveling salesman, Upton Sinclair, Vannevar Bush, Works Progress Administration, zero-sum game
The pair quickly hired several of their programmer friends to move into Allen’s apartment for the summer, with Gates joining when spring classes ended at Harvard. When September came around, Gates decided to stay. The Altair, at the time, was causing significant excitement among early adopters—enthusiasts known as hobbyists—including some in Silicon Valley. At the Homebrew Computer Club, formed just weeks after the Popular Electronics issue premiered the Altair, Steve Wozniak made it to the first meeting, which he would later call “one of the most important nights” of his life. There he saw the Altair demonstrated, beginning to understand the implications and flexibility of low-cost microprocessors. Wozniak spent the next few months hacking together the functions of a keyboard and a microprocessor that could show keystrokes on a television screen—which, in Wozniak’s telling, was an achievement that had never been accomplished outside large corporate mainframes costing tens or often hundreds of thousands of dollars.
Markkula, a young marketing veteran from pre-IPO Intel who had made a fortune from its climbing stock, agreed to provide $250,000 through a combination of a credit line and an equity investment. In turn Markkula received a one-third share, equal to those of Jobs and Wozniak, and turned their entity into a legal corporation. Through the first nine months of 1977, Apple sold 570 units of a new and improved product, the Apple II. Markkula then helped raise a little over $500,000. The bulk of this round, however, came not from Silicon Valley but from the venture capital arm of the Rockefeller family—a small piece of the great oil fortune had circulated into Intel and now into Apple. In 1978 sales grew to $8 million. Then $48 million. In 1980 sales exceeded $100 million for a company that had yet to see its four-year anniversary.
Western USA by Lonely Planet
airport security, Albert Einstein, Apple II, Asilomar, back-to-the-land, Bay Area Rapid Transit, Burning Man, California gold rush, call centre, car-free, carbon footprint, Charles Lindbergh, Chuck Templeton: OpenTable:, Donner party, East Village, edge city, Electric Kool-Aid Acid Test, Frank Gehry, global village, Golden Gate Park, Haight Ashbury, haute couture, haute cuisine, illegal immigration, intermodal, Joan Didion, Kickstarter, Loma Prieta earthquake, Mahatma Gandhi, Mars Rover, Maui Hawaii, off grid, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, starchitect, stealth mode startup, stem cell, Steve Jobs, Steve Wozniak, supervolcano, trade route, transcontinental railway, Upton Sinclair, urban planning, women in the workforce, Works Progress Administration, young professional, Zipcar
GEEKING OUT When California’s Silicon Valley introduced the first personal computer in 1968, advertisements breathlessly gushed that Hewlett-Packard’s ‘light’ (40lb) machine could ‘take on roots of a fifth-degree polynomial, Bessel functions, elliptic integrals and regression analysis’ – all for just $4900 (about $29,000 today). Hoping to bring computer power to the people, 21-year-old Steve Jobs and Steve Wozniak introduced the Apple II at the 1977 West Coast Computer Faire with unfathomable memory (4KB of RAM) and microprocessor speed (1MHz). By the mid-1990s, an entire dot-com industry boomed in Silicon Valley with online start-ups, and suddenly people were getting their mail, news, politics, pet food and, yes, sex online. But when dot-com profits weren’t forthcoming, venture funding dried up, and fortunes in stock-options disappeared on one nasty Nasdaq-plummeting day: March 10, 2000.
The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War (The Princeton Economic History of the Western World) by Robert J. Gordon
"Robert Solow", 3D printing, Affordable Care Act / Obamacare, airline deregulation, airport security, Apple II, barriers to entry, big-box store, blue-collar work, business cycle, Capital in the Twenty-First Century by Thomas Piketty, Charles Lindbergh, clean water, collective bargaining, computer age, creative destruction, deindustrialization, Detroit bankruptcy, discovery of penicillin, Donner party, Downton Abbey, Edward Glaeser, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, feminist movement, financial innovation, full employment, George Akerlof, germ theory of disease, glass ceiling, high net worth, housing crisis, immigration reform, impulse control, income inequality, income per capita, indoor plumbing, industrial robot, inflight wifi, interchangeable parts, invention of agriculture, invention of air conditioning, invention of the sewing machine, invention of the telegraph, invention of the telephone, inventory management, James Watt: steam engine, Jeff Bezos, jitney, job automation, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, labor-force participation, Loma Prieta earthquake, Louis Daguerre, Louis Pasteur, low skilled workers, manufacturing employment, Mark Zuckerberg, market fragmentation, Mason jar, mass immigration, mass incarceration, McMansion, Menlo Park, minimum wage unemployment, mortgage debt, mortgage tax deduction, new economy, Norbert Wiener, obamacare, occupational segregation, oil shale / tar sands, oil shock, payday loans, Peter Thiel, pink-collar, Productivity paradox, Ralph Nader, Ralph Waldo Emerson, refrigerator car, rent control, Robert X Cringely, Ronald Coase, school choice, Second Machine Age, secular stagnation, Skype, stem cell, Steve Jobs, Steve Wozniak, Steven Pinker, The Market for Lemons, The Rise and Fall of American Growth, Thomas Malthus, total factor productivity, transaction costs, transcontinental railway, traveling salesman,
Triangle Shirtwaist Factory, undersea cable, Unsafe at Any Speed, Upton Sinclair, upwardly mobile, urban decay, urban planning, urban sprawl, washing machines reduced drudgery, Washington Consensus, Watson beat the top human players on Jeopardy!, We wanted flying cars, instead we got 140 characters, working poor, working-age population, Works Progress Administration, yellow journalism, yield management
These were Paul Allen, a recent Washington State University dropout, and Bill Gates, who would use his Altair money to drop out of Harvard. In April 1975 they founded Microsoft together. A month earlier, another computer legend, Steve Wozniak, had attended a meeting of the Homebrew Computer Club in California and was immediately inspired by what he saw. He began work on his own personal computer, the Apple Computer, or Apple I, and enlisted his friend Steve Jobs to help with sales. The machine had more memory and a cheaper microprocessor than the Altair and could be plugged into any television for use as a screen. Soon Jobs and Wozniak began work on the Apple II, which added a keyboard, color graphics, and an external cassette tape drive for storage (soon replaced by floppy disks). But it was IBM’s first personal computer (PC), introduced in 1981, that revolutionized the market and soon made the Wang minicomputer and the memory typewriter obsolete.