Valley of Genius: The Uncensored History of Silicon Valley (As Told by the Hackers, Founders, and Freaks Who Made It Boom) by Adam Fisher
Airbnb, Albert Einstein, AltaVista, Apple II, Apple's 1984 Super Bowl advert, augmented reality, autonomous vehicles, Bob Noyce, Brownian motion, Buckminster Fuller, Burning Man, Byte Shop, cognitive dissonance, disintermediation, don't be evil, Donald Trump, Douglas Engelbart, Dynabook, Elon Musk, frictionless, glass ceiling, Hacker Ethic, Howard Rheingold, HyperCard, hypertext link, index card, informal economy, information retrieval, Jaron Lanier, Jeff Bezos, Jeff Rulifson, John Markoff, Jony Ive, Kevin Kelly, Kickstarter, knowledge worker, life extension, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Maui Hawaii, Menlo Park, Metcalfe’s law, Mother of all demos, move fast and break things, Network effects, new economy, nuclear winter, PageRank, Paul Buchheit, paypal mafia, peer-to-peer, Peter Thiel, pets.com, pez dispenser, popular electronics, random walk, risk tolerance, Robert Metcalfe, rolodex, self-driving car, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, social graph, social web, South of Market, San Francisco, Startup school, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, telerobotics, The Hackers Conference, the new new thing, Tim Cook: Apple, tulip mania, V2 rocket, Whole Earth Catalog, Whole Earth Review, Y Combinator
And we were at Stanford Linear Accelerator Center one night, and way in the bowels of their technical library, way down at the last bookshelf in the corner bottom rack, we found an AT&T technical journal that laid out the whole thing. And that’s another moment I’ll never forget. When we saw this journal, we thought, My God, it’s all real! Steve Wozniak: So I designed this little box and Steve said, “Oh, let’s sell it.” R. U. Sirius: And then Jobs and Woz manufactured those blue boxes. Captain Crunch: And I met Steve Wozniak, Steve Jobs, and one or two more individuals who were UC Berkeley students. Steve Wozniak was going there for his engineering degree. Steve Wozniak: We both sold it to people in the dorms for a year. Ron Rosenbaum: It was the beginning of the Apple partnership, even though as far as I can tell they weren’t very good at it then. Steve Jobs: We built the best blue box in the world! It was all digital. Captain Crunch: Their blue boxes were not pure.
The deal was that they got a bonus. Steve Wozniak: They paid Steve Jobs and then he paid me half the money, supposedly. He told me that we would get paid seven hundred bucks. Nolan Bushnell: It was about five grand. Steve Wozniak: Then he wrote me a check for $350. So, whatever. Steve should have been more open and honest with me. He should have told me differently because we were such close friends. But the fun of doing it overrides anything like that. Who cares about money? Right after we finished the game he went up to Oregon and bought into that orchard or whatever it was. Dan Kottke: It was apple harvest time and Steve may have stayed there longer than me, but I was there for like a week doing the apple harvest. We were fasting on apples. It was our fruitarian experiment. And that’s why the name Apple was in the air.
Now that song was worth ninety-nine cents. It disaggregated the album. Guy Bar-Nahum: iPod basically freed Apple from the PC wars. Apple was entrenched in this losing war. It was like World War One: suffering and bleeding and losing territory and losing cultural relevance until iPod came and opened up the big sky of the universal app that is music. Today it looks small, but back then it was infinitely bigger. Music crosses cultures, and operating systems, and everything that was limiting the growth of Apple. It was a big thing. Steve Wozniak: Apple didn’t grow in size ever over the Apple II days until the iPod. And it didn’t grow in size when he introduced the iMac. It all started with the iPod—and it was the openness. Sanjeev Kumar: It brought Apple back from the brink. Andy Hertzfeld: They were making literally billions every week from the handheld music players.
Revolution in the Valley: The Insanely Great Story of How the Mac Was Made by Andy Hertzfeld
He was instrumental in convincing Burrell to switch from the 6809 to the 68000 microprocessor, which turned Jef’s research project into the future of Apple. A year and a half later, in December 1981, he had to leave the project to return to finish his M.D./Ph.D. degree, but he eventually returned to Apple in the summer of 1984. He left Apple to co-found NeXT with Steve Jobs in September 1985, and after a seven-year stint at Sun and a year and a half at Eazel, he returned to Apple as a vice president of software technology in January 2002. Steve Wozniak Steve Wozniak co-founded Apple Computer with Steve Jobs in 1976. His brilliant design for the hardware and software of the Apple II created the foundation for Apple’s initial success. While he didn’t work directly on the original Macintosh, his engineering genius, impeccable integrity and playful sense of humor were a primary inspiration for the Macintosh team.
Eventually, I became so obsessed with the Apple II that I had to go to work at the place that created it. I abandoned graduate school and started work as a systems programmer at Apple in August 1979. Even though the Apple II was overflowing with both technical and marketing genius, the best thing about it was the spirit of its creation. It was not conceived or designed as a commercial product in the usual sense. Apple cofounder Steve Wozniak was just trying to make a great computer for himself and impress his friends at the Homebrew Computer Club. His design somehow projected an audacious sense of infinite horizons, as if the Apple II could do anything, if you were just clever enough. Most of the early Apple employees were their own ideal customers. The Apple II was simultaneously a work of art and the fulfillment of a dream, shared by Apple’s employees and customers.
Its unique spirit was picked up and echoed back by third-party developers, who sprung out of nowhere with innovative applications. Making the transition from an ardent Apple II hobbyist to an Apple employee was like ascending Mount Olympus, walking among the gods, working alongside my heroes. The early team at Apple was full of amazing individuals, people like Steve Wozniak, Rod Holt, and Mike Markkula. It was a privilege to get to know them and learn the company mythology firsthand. Apple’s other co-founder, Steve Jobs, had no shortage of vision or ambition. Flush with the rapidly growing success of the Apple II, Apple initiated two new projects in the fall of 1978 (codenamed Sara and Lisa), which were aimed beyond the hobbyist market.
Hackers: Heroes of the Computer Revolution - 25th Anniversary Edition by Steven Levy
air freight, Apple II, Bill Gates: Altair 8800, Buckminster Fuller, Byte Shop, computer age, computer vision, corporate governance, Donald Knuth, El Camino Real, game design, Hacker Ethic, hacker house, Haight Ashbury, John Conway, John Markoff, Mark Zuckerberg, Menlo Park, non-fiction novel, Norman Mailer, Paul Graham, popular electronics, RAND corporation, reversible computing, Richard Stallman, Silicon Valley, software patent, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, The Hackers Conference, Whole Earth Catalog, Y Combinator
The Apple ad even said, “our philosophy is to provide software for our machines free or at minimal cost.” While the selling was going on, Steve Wozniak began working on an expanded design of the board, something that would impress his Homebrew peers even more. Steve Jobs had plans to sell many computers based on this new design, and he started getting financing, support, and professional help for the day the product would be ready. The new version of Steve Wozniak’s computer would be called the Apple II, and at the time no one suspected that it would become the most important computer in history. • • • • • • • • It was the fertile atmosphere of Homebrew that guided Steve Wozniak through the incubation of the Apple II. The exchange of information, the access to esoteric technical hints, the swirling creative energy, and the chance to blow everybody’s mind with a well-hacked design or program . . . these were the incentives which only increased the intense desire Steve Wozniak already had: to build the kind of computer he wanted to play with.
Most of the Apples at that time used cassette recorders; the difficulty of using an assembler with a cassette recorder made it nearly impossible to go down into the deepest recesses of the machine, the 6502 chip, to speak in the Apple’s assembly language. This was changing: Steve Wozniak had recently hacked a brilliant design for a disk-drive interface for the Apple, and the company was able to offer low-cost floppy-disk drives which accessed thousands of bytes a second, making assembly easy for those few who knew how to program on that difficult level. Those infected with the Hands-On Imperative, of course, would soon join that elite in learning the system at its most primal level. Programmers, would-be programmers, and even users buying Apples would invariably purchase disk drives along with them. Since Steve Wozniak’s Apple adhered to the Hacker Ethic in that it was a totally “open” machine, with an easily available reference guide that told you where everything was on the chip and the motherboard, the Apple was an open invitation to roll your sleeves up and get down to the hexadecimal code of machine level.
He was guided in this by the experienced hand of Mike Markkula, who was taking the Apple venture very seriously. One thing he apparently recognized was that Steve Wozniak’s commitment was to the computer rather than to the company. To Woz, the Apple was a brilliant hack, not an investment. It was his art, not his business. He got his payment by solving puzzles, saving chips, impressing people at Homebrew. This was fine for hacking, but Markkula wanted, at the least, Wozniak’s full-time participation in the company. He told Jobs to tell his partner that if Woz wanted there to be an Apple Computer company, he must quit HP for all-out work on pre-production of the Apple II. It was a tough decision for Wozniak. “This was different than the year we spent throwing the Apple I together in the garage,” Wozniak later recalled. “This was a real company.
Fire in the Valley: The Birth and Death of the Personal Computer by Michael Swaine, Paul Freiberger
1960s counterculture, Amazon Web Services, Apple II, barriers to entry, Bill Gates: Altair 8800, Byte Shop, cloud computing, commoditize, computer vision, Douglas Engelbart, Dynabook, Google Chrome, I think there is a world market for maybe five computers, Internet of things, Isaac Newton, Jaron Lanier, job automation, John Markoff, John von Neumann, Jony Ive, Loma Prieta earthquake, Marc Andreessen, Menlo Park, Mitch Kapor, Mother of all demos, Paul Terrell, popular electronics, Richard Stallman, Robert Metcalfe, Silicon Valley, Silicon Valley startup, stealth mode startup, Steve Ballmer, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, Tim Cook: Apple, urban sprawl, Watson beat the top human players on Jeopardy!, Whole Earth Catalog
One day he sneaked into a programmer’s cubicle and placed a mouse inside his computer. When the programmer returned, it took him more than a few minutes to figure out why his Apple was squeaking. Meanwhile, without the singular vision of a Steve Wozniak, the Apple III project was floundering. Delays in the Apple III were soon causing concern in the marketing department. The young company was beginning to feel growing pains at last. When Apple was formed, the Apple II was already near completion. The Apple III was the first computer that Apple—as a company—had designed and built from scratch. The Apple III was also the first Apple not conceived by Steve Wozniak in pursuit of his personal dream machine. Instead the Apple III was a bit of a hodgepodge, pasted together by many hands and designed by committee. And, as is typical of anything created by committee, the various members weren’t completely happy with the results.
The Debut The young company faced a more modest challenge than tackling the company that had defined computer for generations: they had to finish the Apple II design in time for Jim Warren’s first West Coast Computer Faire in April and get it ready for production shortly thereafter. Markkula was already signing up distributors nationwide, many of whom were eager to work with a company that would give them greater freedom than microcomputer manufacturer MITS had, as well as provide a product that actually did something. * * * Figure 62. Steve Wozniak: Woz scrambles for a phone in one of Apple’s early offices. (Courtesy of Margaret Kern Wozniak) * * * Steve Wozniak is justly credited with the technical design of the Apple I and Apple II. Nevertheless, an essential contribution to making the Apple II a commercial success came from Jobs. Early microcomputers were typically drab and ugly metal boxes.
The design was so simple that he could describe it in just one page and anyone who read the description could duplicate his design. The consummate hobbyist, Woz believed in sharing information. The other hobbyists were duly impressed. Some questioned his choice of processor, but no one argued with the processor’s $20 price tag. He called his machine an Apple. * * * Figure 57. The Apple I: Steve Wozniak’s original Apple I was a circuit board. (Courtesy of Apple Computer Inc.) * * * The Apple I had only the bare essentials. It lacked a case, a keyboard, and a power supply. The hobbyist owner had to connect a transformer to it in order to get it to work. The Apple I also required laborious assembly by hand. Woz spent a lot of time helping friends implement his design. Steve Jobs saw a great financial opportunity in this skeletal machine, and urged Woz to start a company with him. Woz reluctantly agreed.
Steve Jobs by Walter Isaacson
air freight, Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, big-box store, Bob Noyce, Buckminster Fuller, Byte Shop, centre right, Clayton Christensen, cloud computing, commoditize, computer age, computer vision, corporate governance, death of newspapers, don't be evil, Douglas Engelbart, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, fixed income, game design, Golden Gate Park, Hacker Ethic, hiring and firing, Jeff Bezos, Johannes Kepler, John Markoff, Jony Ive, lateral thinking, Mark Zuckerberg, Menlo Park, Mitch Kapor, Mother of all demos, Paul Terrell, profit maximization, publish or perish, Richard Feynman, Robert Metcalfe, Robert X Cringely, Ronald Reagan, Silicon Valley, skunkworks, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supply-chain management, thinkpad, Tim Cook: Apple, Wall-E, Whole Earth Catalog
Markoff, xii; Stewart Brand, “We Owe It All to the Hippies,” Time, Mar. 1, 1995; Jobs, Stanford commencement address; Fred Turner, From Counterculture to Cyberculture (Chicago, 2006). The Homebrew Computer Club: Interviews with Steve Jobs, Steve Wozniak. Wozniak, 152–172; Freiberger and Swaine, 99; Linzmayer, 5; Moritz, 144; Steve Wozniak, “Homebrew and How Apple Came to Be,” www.atariarchives.org; Bill Gates, “Open Letter to Hobbyists,” Feb. 3, 1976. Apple Is Born: Interviews with Steve Jobs, Steve Wozniak, Mike Markkula, Ron Wayne. Steve Jobs, address to the Aspen Design Conference, June 15, 1983, tape in Aspen Institute archives; Apple Computer Partnership Agreement, County of Santa Clara, Apr. 1, 1976, and Amendment to Agreement, Apr. 12, 1976; Bruce Newman, “Apple’s Lost Founder,” San Jose Mercury News, June 2, 2010; Wozniak, 86, 176–177; Moritz, 149–151; Freiberger and Swaine, 212–213; Ashlee Vance, “A Haven for Spare Parts Lives on in Silicon Valley,” New York Times, Feb. 4, 2009; Paul Terrell interview, Aug. 1, 2008, mac-history.net.
Garage Band: Interviews with Steve Wozniak, Elizabeth Holmes, Daniel Kottke, Steve Jobs. Wozniak, 179–189; Moritz, 152–163; Young, 95–111; R. S. Jones, “Comparing Apples and Oranges,” Interface, July 1976. CHAPTER 6: THE APPLE II An Integrated Package: Interviews with Steve Jobs, Steve Wozniak, Al Alcorn, Ron Wayne. Wozniak, 165, 190–195; Young, 126; Moritz, 169–170, 194–197; Malone, v, 103. Mike Markkula: Interviews with Regis McKenna, Don Valentine, Steve Jobs, Steve Wozniak, Mike Markkula, Arthur Rock. Nolan Bushnell, keynote address at the ScrewAttack Gaming Convention, Dallas, July 5, 2009; Steve Jobs, talk at the International Design Conference at Aspen, June 15, 1983; Mike Markkula, “The Apple Marketing Philosophy” (courtesy of Mike Markkula), Dec. 1979; Wozniak, 196–199.
“Well, give me a hug,” he said. And so they hugged. But the biggest news that month was the departure from Apple, yet again, of its cofounder, Steve Wozniak. Wozniak was then quietly working as a midlevel engineer in the Apple II division, serving as a humble mascot of the roots of the company and staying as far away from management and corporate politics as he could. He felt, with justification, that Jobs was not appreciative of the Apple II, which remained the cash cow of the company and accounted for 70% of its sales at Christmas 1984. “People in the Apple II group were being treated as very unimportant by the rest of the company,” he later said. “This was despite the fact that the Apple II was by far the largest-selling product in our company for ages, and would be for years to come.” He even roused himself to do something out of character; he picked up the phone one day and called Sculley, berating him for lavishing so much attention on Jobs and the Macintosh division.
Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything by Steven Levy
Apple II, Apple's 1984 Super Bowl advert, computer age, conceptual framework, Douglas Engelbart, Dynabook, Howard Rheingold, HyperCard, information retrieval, information trail, John Markoff, Kickstarter, knowledge worker, Marshall McLuhan, Mitch Kapor, Mother of all demos, Productivity paradox, QWERTY keyboard, rolodex, Silicon Valley, skunkworks, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Ted Nelson, the medium is the message, Vannevar Bush
As a consequence of the company's success, Apple very quickly had to shift from a garage mentality to the mindset of a budding corporation--one valued, at the time of the PARC visit, at over a billion dollars. It filled several low-slung office buildings in Cupertino, and had hundreds of employees. Though the Apple II was wonderful for its time, Apple's leaders realized that the company needed new products to remain competitive. They began work on the Apple III, a machine roughly as powerful as IBM's personal computer would be. But Steve Jobs had an idea for something even more special: Lisa, a computer that would leapfrog Apple's technology, surpassing not only the Apple II but the Apple III as well. This jump would also vault Apple a generation or so past anything that its competitors were preparing. Begun when Steve Wozniak, at Steve Jobs's request, sketched its architecture on a napkin, Lisa had, in less than a year, evolved to a computer based on the powerful Motorola 68000 microprocessor chip, and was engineered to handle more complicated applications, even running several at the same time, a trick called "multitasking."
In his nightmares, he churned out workmanlike code for creepy bosses in suits. Then he discovered the Apple II. "It changed my life," Andy told me on that first day we met. "The more I learned about it, the more I was impressed with its brilliance." He dropped out of graduate school and began writing Apple programs. One of his hacks filled a gap in the Apple II that Jef Raskin had first identified: it displayed only uppercase letters. His first impulse was to give the program away; in Andy Hertzfeld's mind, anything that helps people use a computer more efficiently is a good in and of itself. But a friend convinced him to sell it, and Hertzfeld made $40,000 in a few months. Andy went to work for Apple in 1979. In some ways it was a dream; he had access to the secrets of the Apple II, and even began a friendship with his hero, Steve Wozniak. On the other hand, the company was just beginning its accommodation with hypergrowth, with some disturbing side effects.
Before coming to Apple, he had been a graduate student combining computer science and neurobiochemistry at the University of Washington, gaining distinction by concocting a graphics program that interpreted CAT scans of the human brain. The stunning visuals produced by Atkinson's work allowed people to see the brain from previously unimagined vistas. Atkinson underwent a conversion experience when he came across an Apple II in 1977. He easily saw past its limitations (it was much less powerful than the machines he worked with at school), instead appreciating the virtuosity of Steve Wozniak's digital design. He went to work for Apple in 1978, employee number 51, writing applications that would help sell the Apple II. But with Lisa, Atkinson faced his biggest challenge. His ninety-minute exposure to Smalltalk had been somewhat deceiving. While the computer in some ways seemed to have completely solved the challenge of allowing an unschooled worker easy access to the information inside the computer (the furniture of cyberspace), when Atkinson sat down and tried to duplicate the task, he realized that there were gaps as yet unfilled.
Exploding the Phone: The Untold Story of the Teenagers and Outlaws Who Hacked Ma Bell by Phil Lapsley
air freight, Apple II, Bill Gates: Altair 8800, Bob Noyce, card file, cuban missile crisis, dumpster diving, Hush-A-Phone, index card, Jason Scott: textfiles.com, John Markoff, Menlo Park, popular electronics, Richard Feynman, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, the new new thing, the scientific method, undersea cable, urban renewal, wikimedia commons
Excerpt from IWOZ: COMPUTER GEEK TO CULT ICON: HOW I INVENTED THE PERSONAL COMPUTER, COFOUNDED APPLE, AND HAD FUN DOING IT by Steve Wozniak and Gina Smith. Copyright © 2006 by Steve Wozniak and Gina Smith. Used by permission of W. W. Norton & Company, Inc. ISBN-13: 978-0-8021-9375-9. Grove Press, an imprint of Grove/Atlantic, Inc., 841 Broadway, New York, NY 10003. Distributed by Publishers Group West, www.groveatlantic.com. To the men and women of the Bell System, and especially to the members of the technical staff of Bell Laboratories, without whom none of this would have been possible. CONTENTS: FOREWORD BY STEVE WOZNIAK; A NOTE ON NAMES AND TENSES; CHAPTER 1 FINE ARTS; CHAPTER 2 BIRTH OF A PLAYGROUND; CHAPTER 3 CAT AND CANARY; CHAPTER 4 THE LARGEST MACHINE IN THE WORLD; CHAPTER 5 BLUE BOX; CHAPTER 6 “SOME PEOPLE COLLECT STAMPS”; CHAPTER 7 HEADACHE; CHAPTER 8 BLUE BOX BOOKIES; CHAPTER 9 LITTLE JOJO LEARNS TO WHISTLE; CHAPTER 10 BILL ACKER LEARNS TO PLAY THE FLUTE; CHAPTER 11 THE PHONE FREAKS OF AMERICA; PHOTO INSERT; CHAPTER 12 THE LAW OF UNINTENDED CONSEQUENCES; CHAPTER 13 COUNTERCULTURE; CHAPTER 14 BUSTED; CHAPTER 15 PRANKS; CHAPTER 16 THE STORY OF A WAR; CHAPTER 17 A LITTLE BIT STUPID; CHAPTER 18 SNITCH; CHAPTER 19 CRUNCHED; CHAPTER 20 TWILIGHT; CHAPTER 21 NIGHTFALL; EPILOGUE; SOURCES AND NOTES; ACKNOWLEDGMENTS; INDEX. THE PLAYGROUND Phone phreak (n.) 1.
While there, Draper claims, he taught the art of phone phreaking to dozens of other inmates. Draper soon went to work for his friend Steve Wozniak at Apple Computer, designing an innovative product called the Charley Board. Charley was an add-in circuit board for the Apple II that connected the computer to the telephone line. With Charley and a few simple programs you could make your Apple II do all sorts of telephonic tricks. Not only could it dial telephone numbers and send touch tones down the line, it could even listen to the calls it placed and recognize basic telephone signals as the call progressed, signals such as a dial tone or busy signal or a ringing signal. With the right programming it could be used as a modem. An Apple II with a Charley Board, in fact, became the ultimate phone phreaking tool. Just as the phone company thought it was natural to mix computers and phone switches, John Draper thought it was natural to mix computers and phone phreaking.
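The touch tones Charley could send down the line are ordinary DTMF signaling, a documented standard: each keypad digit is the sum of one low-group and one high-group sine tone, and a detector watches for energy at those same frequencies (the Goertzel filter is the classic single-bin technique). The sketch below is purely illustrative, not the Charley Board's actual firmware; the sample rate, tone duration, and amplitudes are arbitrary choices for the example.

```python
import math

# Standard DTMF keypad: (low-group Hz, high-group Hz) per key.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_samples(key, duration=0.1, rate=8000):
    """Generate float PCM samples in [-1, 1] for one DTMF digit."""
    low, high = DTMF[key]
    n = int(duration * rate)
    return [
        0.5 * math.sin(2 * math.pi * low * i / rate)
        + 0.5 * math.sin(2 * math.pi * high * i / rate)
        for i in range(n)
    ]

def goertzel_power(samples, freq, rate):
    """Goertzel filter: signal power near one frequency, as a tone detector uses."""
    coeff = 2 * math.cos(2 * math.pi * freq / rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2
```

A detector built this way probes each of the eight DTMF frequencies and declares a digit when exactly one low-group and one high-group bin carry strong energy; recognizing dial tone, busy, or ringback is the same idea applied to those signals' standard frequency pairs and cadences.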
Every one needed hardware and software hackers to help them. Riches, or promises of riches, or maybe just a fun job that might pay the bills beckoned. In 1976 former phone phreaks Steve Jobs and Steve Wozniak were selling Apple I computers to their fellow hobbyists. “Jobs placed ads in hobbyist publications and they began selling Apples for the price of $666.66,” journalist Steven Levy wrote. “Anyone in Homebrew could take a look at the schematics for the design; Woz’s BASIC was given away free with the purchase of a piece of equipment that connected the computer to a cassette recorder.” The fully assembled and tested Apple II followed later that year. By 1977 microcomputers had begun to enter the mainstream. You could stroll down to your local Radio Shack and buy a TRS-80 microcomputer off the shelf, something absolutely unheard of just a year earlier.
The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution by Walter Isaacson
1960s counterculture, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, Apple II, augmented reality, back-to-the-land, beat the dealer, Bill Gates: Altair 8800, bitcoin, Bob Noyce, Buckminster Fuller, Byte Shop, c2.com, call centre, citizen journalism, Claude Shannon: information theory, Clayton Christensen, commoditize, computer age, crowdsourcing, cryptocurrency, Debian, desegregation, Donald Davies, Douglas Engelbart, Douglas Hofstadter, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, en.wikipedia.org, Firefox, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, Haight Ashbury, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, index card, Internet Archive, Jacquard loom, Jaron Lanier, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Joseph-Marie Jacquard, Leonard Kleinrock, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, Mother of all demos, new economy, New Journalism, Norbert Wiener, Norman Macrae, packet switching, PageRank, Paul Terrell, pirate software, popular electronics, pre–internet, RAND corporation, Ray Kurzweil, RFC: Request For Comment, Richard Feynman, Richard Stallman, Robert Metcalfe, Rubik’s Cube, Sand Hill Road, Saturday Night Live, self-driving car, Silicon Valley, Silicon Valley startup, Skype, slashdot, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Steven Pinker, Stewart Brand, technological singularity, technoutopianism, Ted Nelson, The Coming Technological Singularity, The Nature of the Firm, The Wisdom of Crowds, Turing complete, Turing machine, Turing test, Vannevar Bush, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Whole Earth Review, wikimedia commons, William Shockley: the traitorous eight
Harold Singer, “Open Letter to Ed Roberts,” Micro-8 Computer User Group newsletter, Mar. 28, 1976. 80. Author’s interview with Lee Felsenstein. 81. Bill Gates interview, Playboy, July 1994. 82. This section draws from my Steve Jobs (Simon & Schuster, 2011), which was based on interviews with Steve Jobs, Steve Wozniak, Nolan Bushnell, Al Alcorn, and others. The Jobs biography includes a bibliography and source notes. For this book, I reinterviewed Bushnell, Alcorn, and Wozniak. This section also draws on Steve Wozniak, iWoz (Norton, 2006); Steve Wozniak, “Homebrew and How the Apple Came to Be,” http://www.atariarchives.org/deli/homebrew_and_how_the_apple.php. 83. When I posted an early draft of parts of this book for crowdsourced comments and corrections on Medium, Dan Bricklin offered useful suggestions. We got into an exchange about the creation of VisiCalc, and I subsequently added this section to the book.
Author’s interview with Bill Gates. 116. Michael Riordan and Lillian Hoddeson, “Crystal Fire,” IEEE SCS News, Spring 2007, adapted from Crystal Fire (Norton, 1997). 117. Author’s interviews with Lee Felsenstein, Steve Wozniak, Steve Jobs, and Bob Albrecht. This section also draws from the accounts of the Homebrew Computer Club origins in Wozniak, iWoz (Norton, 2006); Markoff, What the Dormouse Said, 4493 and passim; Levy, Hackers, 201 and passim; Freiberger and Swaine, Fire in the Valley, 109 and passim; Steve Wozniak, “Homebrew and How the Apple Came to Be,” http://www.atariarchives.org/deli/homebrew_and_how_the_apple.php; the Homebrew archives exhibit at the Computer History Museum; the Homebrew newsletter archives, http://www.digibarn.com/collections/newsletters/homebrew/; Bob Lash, “Memoir of a Homebrew Computer Club Member,” http://www.bambi.net/bob/homebrew.html. 118.
When Wired magazine featured maker culture in its April 2011 issue, it put a woman engineer on its cover for the first time, the MIT-trained do-it-yourself entrepreneur Limor Fried, whose moniker “ladyada” and company name Adafruit Industries were homages to Ada Lovelace. 31. To listen to Dompier’s Altair play “Fool on the Hill,” go to http://startup.nmnaturalhistory.org/gallery/story.php?ii=46. 32. After they became successful, Gates and Allen donated a new science building to Lakeside and named its auditorium after Kent Evans. 33. Steve Wozniak’s unwillingness to tackle this tedious task when he wrote BASIC for the Apple II would later force Apple to license BASIC from Allen and Gates. 34. Reading a draft version of this book online, Steve Wozniak said that Dan Sokol made only eight copies, because they were hard and time-consuming to make. But John Markoff, who reported this incident in What the Dormouse Said, shared with me (and Woz and Felsenstein) the transcript of his interview with Dan Sokol, who said he used a PDP-11 with a high-speed tape reader and punch.
Founders at Work: Stories of Startups' Early Days by Jessica Livingston
8-hour work day, affirmative action, AltaVista, Apple II, Brewster Kahle, business cycle, business process, Byte Shop, Danny Hillis, David Heinemeier Hansson, don't be evil, fear of failure, financial independence, Firefox, full text search, game design, Googley, HyperCard, illegal immigration, Internet Archive, Jeff Bezos, Joi Ito, Justin.tv, Larry Wall, Maui Hawaii, Menlo Park, Mitch Kapor, nuclear winter, Paul Buchheit, Paul Graham, Peter Thiel, Richard Feynman, Robert Metcalfe, Ruby on Rails, Sam Altman, Sand Hill Road, side project, Silicon Valley, slashdot, social software, software patent, South of Market, San Francisco, Startup school, stealth mode startup, Steve Ballmer, Steve Jobs, Steve Wozniak, web application, Y Combinator
Little did he know that I was actually up all night writing a business plan, not partying. CHAPTER 3 Steve Wozniak Cofounder, Apple Computer If any one person can be said to have set off the personal computer revolution, it might be Steve Wozniak. He designed the machine that crystallized what a desktop computer was: the Apple II. Wozniak and Steve Jobs founded Apple Computer in 1976. Between Wozniak’s technical ability and Jobs’s mesmerizing energy, they were a powerful team. Woz first showed off his home-built computer, the Apple I, at Silicon Valley’s Homebrew Computer Club in 1976. After Jobs landed a contract with the Byte Shop, a local computer store, for 100 preassembled machines, Apple was launched on a rapid ascent. Woz soon followed with the machine that made the company: the Apple II. He single-handedly designed all its hardware and software—an extraordinary feat even for the time.
So a bunch of Apple engineers and marketing people got to benefit from going public. Otherwise, they’d have no stock at all. Mike Markkula kind of felt that some of these people didn’t deserve it; some people shouldn’t get stock. But I disagreed with him on that. Nobody stopped me, so I did it. Livingston: But you still kept enough stock for yourself to buy a house, right? Wozniak: The money I got from Apple employees, I used to buy a house. It was kind of an early state to be selling out 15 percent of your stock, but hey, that was a great opportunity for me. When I designed the Apple stuff, I never thought in my life I would have enough money to fly to Hawaii or make a down payment on a house. So it was a huge deal for me. Steve Jobs (left) and Steve Wozniak (right) in 1975 with a blue box Photo by Margret Wozniak CHAPTER 4 Joe Kraus Cofounder, Excite Joe Kraus started Excite (originally called Architext) in 1993 with five Stanford classmates.
Mike Scott was starting to make some real rash, quick decisions, and not be as careful as was needed, and as he’d been in the past. The board gave him another job and he wrote a very shocking resignation letter that, basically, life was too important for this political type stuff. It was sad to see him go because he supported good people so well in the company. Livingston: What about Ron Wayne? Wasn’t he one of the founders? Wozniak: Yes, but not when we incorporated as a real company. We had two phases. One was as a partnership with Steve Jobs for the Apple I, and then for the Apple II, we became a corporation, Apple Computer, Incorporated. Steve knew Ron at Atari and liked him. Ron was a super-conservative guy. I didn’t know anything about politics of any sort; I avoided it. But he had read all these right-wing books like None Dare Call It Treason, and he could rattle the stuff off.
Start With Why: How Great Leaders Inspire Everyone to Take Action by Simon Sinek
Apple II, Apple's 1984 Super Bowl advert, Black Swan, business cycle, commoditize, hiring and firing, John Markoff, low cost airline, Nick Leeson, RAND corporation, risk tolerance, Ronald Reagan, shareholder value, Steve Ballmer, Steve Jobs, Steve Wozniak, The Wisdom of Crowds, trade route
They hung out with hippie types who shared their beliefs, but they saw a different way to change the world that didn’t require protesting or engaging in anything illegal. Steve Wozniak and Steve Jobs came of age in this time. Not only was the revolutionary spirit running high in Northern California, but it was also the time and place of the computer revolution. And in this technology they saw the opportunity to start their own revolution. “The Apple gave an individual the power to do the same things as any company,” Wozniak recounts. “For the first time ever, one person could take on a corporation simply because they had the ability to use the technology.” Wozniak engineered the Apple I and later the Apple II to be simple enough for people to harness the power of the technology. Jobs knew how to sell it. Thus was born Apple Computer. A company with a purpose—to give the individual the power to stand up to established power.
New York: Farrar, Straus and Giroux, 2005. 140 “If it hadn’t been for my big brother”: Bob Thomas, Building a Company: Roy O. Disney and the Creation of an Entertainment Empire. New York: Disney Editions, 1998. 142 Herb Kelleher was able to personify and preach the cause of freedom: Kevin Freiberg and Jackie Freiberg, Nuts! Southwest Airlines’ Crazy Recipe for Business and Personal Success. New York: Broadway, 1998. 142 Steve Wozniak is the engineer who made the Apple work: Steve Wozniak, personal interview, November 2008. 143 Bill Gates and Paul Allen went to high school together in Seattle: Randy Alfred, “April 4, 1975: Bill Gates, Paul Allen Form a Little Partnership,” Wired, April 4, 2008, http://www.wired.com/science/discoveries/news/2008/04/dayintech_0404. 145 Oprah Winfrey once gave away a free car: Ann Oldenburg, “$7M car giveaway stuns TV audience,” USA Today, September 13, 2004, http://www.usatoday.com/life/people/2004-09-13-oprah-cars_x.htm. 150 the Education for Employment Foundation: http://www.efefoundation.org/homepage.html; Lisa Takeuchi Cullen, “Gainful Employment,” Time, September 20, 2007, http://www.time.com/time/magazine/article/0,9171,1663851,00.html; Ron Bruder, personal interview, February 2009.
But it wasn’t until 1976, nearly three years after the end of America’s military involvement in the Vietnam conflict, that a different revolution ignited. They aimed to make an impact, a very big impact, even challenge the way people perceived how the world worked. But these young revolutionaries did not throw stones or take up arms against an authoritarian regime. Instead, they decided to beat the system at its own game. For Steve Wozniak and Steve Jobs, the cofounders of Apple Computer, the battlefield was business and the weapon of choice was the personal computer. The personal computer revolution was beginning to brew when Wozniak built the Apple I. Just starting to gain attention, the technology was primarily seen as a tool for business. Computers were too complicated and out of the price range of the average individual. But Wozniak, a man not motivated by money, envisioned a nobler purpose for the technology. He saw the personal computer as a way for the little man to take on a corporation.
Troublemakers: Silicon Valley's Coming of Age by Leslie Berlin
AltaVista, Apple II, Asilomar, Asilomar Conference on Recombinant DNA, beat the dealer, Bill Gates: Altair 8800, Bob Noyce, Byte Shop, Clayton Christensen, cloud computing, computer age, discovery of DNA, don't be evil, Donald Knuth, double helix, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Thorp, El Camino Real, fear of failure, Fellow of the Royal Society, financial independence, game design, Haight Ashbury, hiring and firing, industrial robot, informal economy, Internet of things, inventory management, John Markoff, Kickstarter, Kitchen Debate, Leonard Kleinrock, manufacturing employment, Mark Zuckerberg, Menlo Park, Minecraft, Mother of all demos, packet switching, Ralph Nader, Robert Metcalfe, rolodex, Ronald Reagan, Sand Hill Road, Silicon Valley, Silicon Valley startup, Snapchat, software as a service, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, union organizing, upwardly mobile, William Shockley: the traitorous eight, women in the workforce
Harriet Stix, “A UC Berkeley Degree Is Now the Apple of Steve Wozniak’s Eye,” Los Angeles Times, May 14, 1986. 18. Marilyn Chase, “Technical Flaws Plague Apple’s New Computer,” Wall Street Journal, April 15, 1981. Apple III prices ranged from $4,300 to nearly $8,000, compared to the Apple II systems at about half the cost. 19. Apple fixed the problems and brought the Apple III back in late 1981—“Let me re-introduce myself,” one advertisement began—but not much software was written for the machine, and it was not anywhere near as popular as the Apple II. (Sales were around 1,000 per month versus the Apple II’s 15,000.) By one estimate (Brent Schlender and Rick Tetzeli, Becoming Steve Jobs: The Evolution of a Reckless Upstart into a Visionary Leader [New York: Crown Business, 2015]: 72), before the Apple III was discontinued in 1984, only 120,000 had been sold.
“Wozniak recalls that” (in footnote): Michael Moritz, Return to the Little Kingdom: How Apple and Steve Jobs Changed the World (New York: Overlook Press, 2010): 186. 18. The remaining shares were set aside for yet-to-be-hired employees. Confidential Private Placement memorandum, Nov. 18, 1977: 9, ACM; Apple IPO Prospectus: 25. “Big time” (in footnote): Markkula, interview by author, Feb. 24, 2016. On Ron Wayne (in footnote): Ronald G. Wayne, Adventures of an Apple Founder (2010: 512k Entertainment): 64, 105-6; Atari Standards Drafting Manual, SB. 19. “Apple Computer (A),” Graduate School of Business, Stanford University, S-BP-229(A): 4. 20. Wendy Quiones, “Pioneering a Revolution: Apple’s Steve Jobs and Steve Wozniak,” Boston Computer Update, July–August 1981. 21. Norman Sklarewitz, “A Used Volkswagen Van and a $500 Commission Were the Starting Capital for Apple Computer,” SF Executive, December 1979. 22.
There Are No Standards Yet MIKE MARKKULA Even by the time Mike Markkula visited Steve Jobs and Steve Wozniak in the garage in the fall of 1976, Apple was a profitable, albeit very small and very amateur, operation. The circuit boards sold to the Byte Shop for $500 each cost Apple about $220 to assemble.1 Before Markkula, however, Apple was a business by only the loosest definition. The family bedrooms and garage were rent free. The sales force was Jobs and Wozniak driving around to electronics stores and asking the owners if they wanted to sell Apple computers.2 The only two people being paid for their labor were Jobs’s sister and a friend, Dan Kottke, who earned $1 per board and $4 per hour, respectively, for their work. Jobs and Wozniak had come up with the $666.66 retail price for the Apple I by adding 30 percent to the $500 they were charging the Byte Shop and rounding so that the price would contain repeating digits—something that Wozniak enjoyed seeing.3 Markkula’s note card commitment had been to help promising entrepreneurs in any way he could one day per week.
Commodore: A Company on the Edge by Brian Bagnall
Apple II, belly landing, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, Douglas Engelbart, Douglas Engelbart, Firefox, game design, index card, inventory management, Isaac Newton, low skilled workers, Menlo Park, packet switching, pink-collar, popular electronics, prediction markets, pre–internet, QWERTY keyboard, Robert Metcalfe, Robert X Cringely, Silicon Valley, special economic zone, Steve Jobs, Steve Wozniak, Ted Nelson
They couldn’t put an R/F modulator in it to hook to your TV set because obviously that was something for the home.” Without an R/F modulator, the Apple II was too complicated for inexperienced users. “The PET and the TRS-80 both came with their own monitors, so they were a more appropriate solution for most people than the Apple II was,” says Yannes. The original design by Steve Wozniak also had several flaws. “Right after the Apple II came out, Electronic Engineering Times wrote a story about the three major design flaws that Woz made on the Apple II,” says Peddle. “He didn’t understand the way the chipset worked and some other electronics stuff.” In response to these problems, Apple hired an engineer to redesign Wozniak’s motherboard. “There was a guy who was hired at Apple to redesign the Apple II and make it real engineering without offending Woz,” explains Peddle.
I wanted to do something that helped the world in some way, but I was in Oshkosh.” After a tour of duty in Vietnam and a business degree, Tomczyk travelled to Silicon Valley in 1979. “I used to hang out at Apple almost daily. I would hang out with Steve Wozniak and Andy Hertzfeld and some of the people developing the Apple computers because I was planning to go into the industry and I wanted to learn.” Although he was not an Apple employee, he walked the halls freely. “I was kind of like a groupie at Apple. They let me come and go as I wanted. I could walk around even without an ID tag, which was forbidden at Apple. People used to ask me, ‘Where’s your ID tag? Go get your visitor badge!’ And I used to just wave them off.” After six months as Tramiel’s personal assistant, Tomczyk felt ready to make his contribution with the VIC computer launch.
“Steve [Jobs] is a very, very charismatic guy,” recalls Leonard Tramiel. However, the mediocre sales of the Apple I computer, in the period before Markkula and McKenna, testify to his lack of results compared to other kit computers. “It’s really easy to get carried away with what he is saying, but as far as actually producing sales and making money and selling machines, Commodore did a far better job than Apple.” Although Steve Jobs was a natural, Steve Wozniak lacked a compelling persona. “If you were to read the literature during that time, you will discover that [McKenna] probably took Woz to two places and then dumped him because Woz just didn’t come across as smart and interesting,” recalls Peddle. In May 1977, Apple moved into their first company headquarters in Palo Alto, close to Commodore. McKenna launched Apple’s first advertisement in Byte magazine in July 1977, a month after Commodore sold its first PET computers.
Equal Is Unfair: America's Misguided Fight Against Income Inequality by Don Watkins, Yaron Brook
3D printing, Affordable Care Act / Obamacare, Apple II, barriers to entry, Berlin Wall, Bernie Madoff, blue-collar work, business process, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, collective bargaining, colonial exploitation, corporate governance, correlation does not imply causation, creative destruction, Credit Default Swap, crony capitalism, David Brooks, deskilling, Edward Glaeser, Elon Musk, en.wikipedia.org, financial deregulation, immigration reform, income inequality, indoor plumbing, inventory management, invisible hand, Isaac Newton, Jeff Bezos, Jony Ive, laissez-faire capitalism, Louis Pasteur, low skilled workers, means of production, minimum wage unemployment, Naomi Klein, new economy, obamacare, Peter Singer: altruism, Peter Thiel, profit motive, rent control, Ronald Reagan, Silicon Valley, Skype, statistical model, Steve Jobs, Steve Wozniak, The Spirit Level, too big to fail, trickle-down economics, Uber for X, urban renewal, War on Poverty, wealth creators, women in the workforce, working poor, zero-sum game
Spending on the Basics as a Share of Disposable Personal Income,” HumanProgress.org, http://humanprogress.org/static/us-spending-on-basics (accessed April 13, 2015). 5. Steve Wozniak with Gina Smith, iWoz: Computer Geek to Cult Icon (New York: Norton, 2006), pp. 12–13. 6. Ibid., p. 18. 7. Ibid., pp. 54–55. 8. Ibid., pp. 155–56. 9. “National Inventors Hall of Fame,” Ohio History Central, http://www.ohiohistorycentral.org/w/National_Inventors_Hall_of_Fame?rec=1727 (accessed August 31, 2015). 10. Quoted in Sean Rossman, “Apple’s ‘The Woz’ Talks Jobs, Entrepreneurship,” Tallahassee Democrat, November 6, 2014, http://www.tallahassee.com/story/news/local/2014/11/05/apples-woz-talks-jobs-entrepreneurship/18561425/ (accessed April 13, 2015). 11. Quoted in Alec Hogg, “Apple’s ‘Other’ Steve—Wozniak on Jobs, Starting a Business, Changing the World, and Staying Hungry, Staying Foolish,” BizNews.com, February 17, 2014, http://www.biznews.com/video/2014/02/17/apples-other-steve-wozniak-on-jobs-starting-a-business-changing-the-world/ (accessed April 13, 2015). 12.
But we do live on a Glorious Earth, where we can make life amazing. And it can be amazing for everyone, because it turns out that the way we improve our lives—ingenuity and effort—is not a fixed-sum game, where we battle over a static amount of wealth. We produce wealth, and there is no limit to how much wealth we can produce. Who Created the Modern World? In his autobiography, Apple cofounder Steve Wozniak, or Woz, as he’s usually called, describes how his dad, an engineer, would explain to the four-year-old Woz how electronics worked. “I remember sitting there and being so little, and thinking: ‘Wow, what a great, great world he’s living in,’” Woz recalls. “I mean, that’s all I thought: ‘Wow.’ For people who know how to do this stuff—how to take these little parts and make them work together to do something—well, these people must be the smartest people in the world. . . .
Quoted in Alec Hogg, “Apple’s ‘Other’ Steve—Wozniak on Jobs, Starting a Business, Changing the World, and Staying Hungry, Staying Foolish,” BizNews.com, February 17, 2014, http://www.biznews.com/video/2014/02/17/apples-other-steve-wozniak-on-jobs-starting-a-business-changing-the-world/ (accessed April 13, 2015). 12. Walter Isaacson, Steve Jobs (New York: Simon & Schuster, 2011), p. 295. 13. Ibid., pp. 308, 318. 14. Ibid., p. 317. 15. Ibid., p. 337. 16. Ibid., pp. 318–19. 17. Ibid., p. 329. 18. William J. Bernstein, The Birth of Plenty (New York: McGraw-Hill, 2004), p. 125. 19. Isaacson, Steve Jobs, pp. 76–77. 20. Ibid., pp. 340–43. 21. Ayn Rand, Atlas Shrugged (New York: Penguin, 1999), p. 1065. 22. David Harriman (ed.), Journals of Ayn Rand (New York: Plume, 1999), p. 421. 23. Angus Deaton, The Great Escape: Health, Wealth, and the Origins of Inequality (Princeton, NJ: Princeton, 2013), pp. 45–46. 24.
The Master Switch: The Rise and Fall of Information Empires by Tim Wu
accounting loophole / creative accounting, Alfred Russel Wallace, Apple II, barriers to entry, British Empire, Burning Man, business cycle, Cass Sunstein, Clayton Christensen, commoditize, corporate raider, creative destruction, disruptive innovation, don't be evil, Douglas Engelbart, Douglas Engelbart, Howard Rheingold, Hush-A-Phone, informal economy, intermodal, Internet Archive, invention of movable type, invention of the telephone, invisible hand, Jane Jacobs, John Markoff, Joseph Schumpeter, Menlo Park, open economy, packet switching, PageRank, profit motive, road to serfdom, Robert Bork, Robert Metcalfe, Ronald Coase, sexual politics, shareholder value, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, Telecommunications Act of 1996, The Chicago School, The Death and Life of Great American Cities, the market place, The Wisdom of Crowds, too big to fail, Upton Sinclair, urban planning, zero-sum game
He wanted it that way. The Apple II was my machine, and the Mac was his.” Apple’s origins were pure Steve Wozniak, but as everyone knows, it was the other founder, Steve Jobs, whose ideas made Apple what it is today. Jobs maintained the early image that he and Wozniak created, but beginning with the Macintosh in the 1980s, and accelerating through the age of the iPod, iPhone, and iPad, he led Apple computers on a fundamentally different track. Jobs is a man who would seem as much at home in Victorian England as behind the counter of a sushi bar: he is an apostle of perfectibility and believes in a single best way of performing any task and presenting the results. As one might expect, his ideas embody an aesthetic philosophy as much as a sense of functionality, which is why Apple’s products look so good while working so well.
The history of the firm must be understood in this light. For while founders do set the culture of a firm, they cannot dictate it in perpetuity; as Wozniak withdrew from the operation, Apple became more and more concerned with, as it were, the aesthetics of radicalism than with its substance. Steve Wozniak is not the household name that Steve Jobs is, but his importance to communications and culture in the postwar period merits a closer look. While Apple’s wasn’t the only personal computer invented in the 1970s, it was the most influential. For the Apple II took personal computing, an obscure pursuit of the hobbyist, and made it into a nationwide phenomenon, one that would ultimately transform not just computing, but communications, culture, entertainment, business—in short, the whole productive part of American life.
“It’s pretty rare to make your engineering an art,” said Wozniak, “but that’s how it should be.”8 The original Apple had a hood; and as with a car, the owner could open it up and get at the guts of the machine. Indeed, although it was a fully assembled device, not a kit like earlier PC products, one was encouraged to tinker with the innards, to soup it up, make it faster, add features, whatever. The Apple’s operating system, using a form of BASIC as its programming language and operating environment, was, moreover, one that anyone could program. It made it possible to write and sell one’s programs directly, creating what we now call the “software” industry. In 2006, I briefly met with Steve Wozniak on the campus of Columbia University. “There’s a question I’ve always wanted to ask you,” I said. “What happened with the Mac? You could open up the Apple II, and there were slots and so on, and anyone could write for it.
Always Day One: How the Tech Titans Plan to Stay on Top Forever by Alex Kantrowitz
accounting loophole / creative accounting, Albert Einstein, AltaVista, Amazon Web Services, augmented reality, Automated Insights, autonomous vehicles, Bernie Sanders, Clayton Christensen, cloud computing, collective bargaining, computer vision, Donald Trump, drone strike, Elon Musk, Firefox, Google Chrome, hive mind, income inequality, Infrastructure as a Service, inventory management, iterative process, Jeff Bezos, job automation, Jony Ive, knowledge economy, Lyft, Mark Zuckerberg, Menlo Park, new economy, Peter Thiel, QR code, ride hailing / ride sharing, self-driving car, Silicon Valley, Skype, Snapchat, Steve Ballmer, Steve Jobs, Steve Wozniak, Tim Cook: Apple, uber lyft, wealth creators, zero-sum game
utm_source=Memberful&utm_campaign=131ddd5a64-weekly_article_2019_01_07&utm_medium=email&utm_term=0_d4c7fece27-131ddd5a64-110945413. “I’m happy with my iPhone 8”: Balakrishnan, Anita, and Deirdre Bosa. “Apple Co-Founder Steve Wozniak: iPhone X Is the First iPhone I Won’t Buy on ‘Day One.’” CNBC. CNBC, October 23, 2017. https://www.cnbc.com/2017/10/23/apple-co-founder-steve-wozniak-not-upgrading-to-iphone-x-right-away.html. Cook, in an interview with CNBC: “CNBC Exclusive: CNBC Transcript: Apple CEO Tim Cook Speaks with CNBC’s Jim Cramer Today.” CNBC. CNBC, January 8, 2019. https://www.cnbc.com/2019/01/08/exclusive-cnbc-transcript-apple-ceo-tim-cook-speaks-with-cnbcs-jim-cramer-today.html. Apple had Siri: Gross, Doug. “Apple Introduces Siri, Web Freaks Out.” CNN. Cable News Network, October 4, 2011. https://www.cnn.com/2011/10/04/tech/mobile/siri-iphone-4s-skynet/index.html.
A struggling Chinese economy and the brewing trade war between China and the US played a part in the lower iPhone sales, but another factor loomed larger: Smartphones, after years of big advances, had become good enough that owning a top-of-the-line model was no longer important. People could wait longer to upgrade, and that put a dent in Apple sales. In November 2018, Apple had said it would no longer report unit sales of the iPhone, an indication of what was to come. Apple cofounder Steve Wozniak himself offered a convincing argument that the iPhone was reaching a point where upgrading wasn’t all that necessary. “I’m happy with my iPhone 8, which is the same as the iPhone 7, which is the same as the iPhone 6,” he said in a 2017 interview, adding that he would not be upgrading to the iPhone X. “Look at cars. For hundreds of years a car kind of had four wheels, about the size that would fit people inside, and headlights.
Apple’s values were implied: it was among this group, a troublemaker and not a faceless corporation. Today, Apple is no longer crazy, or a rebel, or a troublemaker. It’s a trillion-dollar Goliath with power over the small guys it once counted itself among. Its products, once revolutionary, are now establishment. Its messaging has therefore shifted. What does Apple stand for? The iPhone. And to market it, its value is privacy. A Drive down 280 As I started wrapping my reporting for this chapter, I wondered where Apple will go now that the iPhone has reached “a form that’s right” and the company’s inventive muscle seems to have atrophied. So I crossed my fingers and wrote to Steve Wozniak, figuring he might have some ideas. After a few emails, Wozniak told me to meet him the following Wednesday morning at the Original Hick’ry Pit, a barbecue restaurant near Campbell, California, not far from Apple’s campus.
Inventors at Work: The Minds and Motivation Behind Modern Inventions by Brett Stern
Apple II, augmented reality, autonomous vehicles, bioinformatics, Build a better mousetrap, business process, cloud computing, computer vision, cyber-physical system, distributed generation, game design, Grace Hopper, Richard Feynman, Silicon Valley, skunkworks, Skype, smart transportation, speech recognition, statistical model, stealth mode startup, Steve Jobs, Steve Wozniak, the market place, Yogi Berra
Calvert: These are the things that an inventor really needs to learn and work out before taking that big step of sending a patent application to the USPTO—hopefully as the prelude to starting up their own business. 1 www.uspto.gov/inventors/independent/eye/201206/index.jsp 2 www.uiausa.org CHAPTER 23 Steve Wozniak Co-Founder Apple Computer A Silicon Valley icon and philanthropist for more than thirty years, Steve Wozniak helped shape the computing industry with his design of Apple’s first line of products, the Apple I and II, and influenced the popular Macintosh. In 1976, Wozniak and Steve Jobs founded Apple Computer Inc. with Wozniak’s Apple I personal computer. The following year he introduced his Apple II personal computer, featuring a central processing unit, a keyboard, color graphics, and a floppy disk drive. The Apple II was integral to launching the personal computer industry. Wozniak is named sole inventor on the US patent for “microcomputer for use with video display.”
His bestselling autobiography—iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It—was published in 2006 (W. W. Norton). His television appearances include Kathy Griffin: My Life on the D-List, Dancing with the Stars, and The Big Bang Theory. Brett Stern: You talk about having an engineering side and a human side. Any thoughts on what the difference is, and how you define those sides? Steve Wozniak: I talk about the difference between the engineering side and the human side in two different senses. One is the general sense of developing technology products. If you look at Apple history, you’ll find out that the most important thing that made Apple great—and made Steve Jobs such a great person—was our focus on understanding the users more than understanding the technology.
Tim Leatherman, Folding Hand Tools Chapter 15. Reyn Guyer, Toys Chapter 16. Bernhard van Lengerich, Food Manufacturing Chapter 17. Curt Croley, Shane MacGregor, Graham Marshall, Mobile Devices Chapter 18. Matthew Scholz, Healthcare Products Chapter 19. Daria Mochly-Rosen, Drugs Chapter 20. Martin Keen, Footwear Chapter 21. Kevin Deppermann, Seed Genomes Chapter 22. John Calvert, Elizabeth Dougherty, USPTO Chapter 23. Steve Wozniak, Personal Computers Index About the Author Brett Stern is an industrial designer and inventor living in Portland, Oregon. He holds eight utility patents covering surgical instruments, medical implants, and robotic garment-manufacturing systems. He holds trademarks in 34 countries on a line of snack foods that he created. He has worked as an industrial design consultant for such clients as Pfizer, Revlon, and Saatchi & Saatchi, and as a costume materials technologist for Warner Bros.
Becoming Steve Jobs: The Evolution of a Reckless Upstart Into a Visionary Leader by Brent Schlender, Rick Tetzeli
Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, Bill Gates: Altair 8800, Bob Noyce, Byte Shop, Charles Lindbergh, computer age, corporate governance, El Camino Real, Isaac Newton, John Markoff, Jony Ive, Kickstarter, Marc Andreessen, market design, McMansion, Menlo Park, Paul Terrell, popular electronics, QWERTY keyboard, Ronald Reagan, Sand Hill Road, side project, Silicon Valley, Silicon Valley startup, skunkworks, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Tim Cook: Apple, Wall-E, Watson beat the top human players on Jeopardy!, Whole Earth Catalog
The traffic on 280 and 101 had been at a standstill much of the way up from Cupertino, way down south in Silicon Valley, where the company he’d founded, Apple Computer, had its headquarters, and where he had just suffered through a meeting of Apple’s board of directors, which was chaired by the venerable Arthur Rock. He and Rock didn’t see eye-to-eye on much of anything. Rock treated him like a child. Rock loved order, he loved processes, he believed that tech companies grew in certain ways according to certain rules, and he subscribed to these beliefs because he’d seen them work before, most notably at Intel, the great Santa Clara chipmaker that he had backed early on. Rock was perhaps the most notable tech investor of his time, but he in fact had been reluctant to back Apple at first, largely because he’d found Steve and his partner Steve Wozniak unpalatable. He didn’t see Apple the way Jobs saw it—as an extraordinary company that would humanize computing and do so with a defiantly unhierarchical organization.
Other Newspapers and Magazines BusinessWeek/BloombergBusinessweek Esquire Fast Company Fortune New York Times The New Yorker Newsweek San Francisco Chronicle San Jose Mercury News Time Wall Street Journal Wired Websites allaboutstevejobs.com apple.com apple-history.com Computer History Museum: www.computerhistory.org/atchm/steve-jobs/ cultofmac.com donmelton.com/2014/04/10/memories-of-steve/ everystevejobsvideo.com Fastcodesign.com, a Fast Company website that focuses on design news, May 22, 2014, http://www.fastcodesign.com/3030923/4-myths-about-apple-design-from-an-ex-apple-designer Forbes billionaires list: “Two Decades of Wealth,” www.forbes.com/static_html/rich400/2002/timemapFLA400.html foundersatwork.com; interview with Stephen Wozniak, www.foundersatwork.com/steve-wozniak.html Gartner Group: http://www.gartner.com/newsroom/id/2301715 Golden Gate Weather: http://ggweather.com/sjc/daily_records.html#September National Cancer Institute: http://www.cancer.gov/cancertopics/pdq/treatment/isletcell/HealthProfessional National Trust for Historic Preservation: preservationnation.org (Jackling Mansion details) National Mining Hall of Fame, Leadville, Co.: http://www.mininghalloffame.org/inductee/jackling news.cnet.com paloalto.patch.com/groups/opinion/p/my-neighbor-steve-jobs quora.com; http://www.quora.com/Steve-Jobs/What-are-the-best-stories-about-people-randomly-meeting-Steve-Jobs/answer/Tim-Smith-18.
Long before Internet mania started churning out wunderkinds of the week, Jobs was technology’s original superstar, the real deal with an astounding, substantial record. The circuit boards he and Steve Wozniak had assembled in a garage in Los Altos had spawned a billion-dollar company. The personal computer seemed to have unlimited potential, and as the cofounder of Apple Computer, Steve Jobs had been the face of all those possibilities. But then, in September of 1985, he had resigned under pressure, shortly after telling the company’s board of directors that he was courting some key Apple employees to join him in a new venture to build computer “workstations.” The fascinated media had thoroughly dissected his departure, with both Fortune and Newsweek putting the ignominious saga on their covers. In the six months since, the details of his new startup had been kept hush-hush, in part because Apple had filed lawsuits trying to prevent Jobs from hiring away its employees.
The Self-Made Billionaire Effect: How Extreme Producers Create Massive Value by John Sviokla, Mitch Cohen
business cycle, Cass Sunstein, Colonization of Mars, corporate raider, Daniel Kahneman / Amos Tversky, Elon Musk, Frederick Winslow Taylor, game design, global supply chain, James Dyson, Jeff Bezos, John Harrison: Longitude, Jony Ive, loss aversion, Mark Zuckerberg, market design, old-boy network, paper trading, RAND corporation, randomized controlled trial, Richard Thaler, risk tolerance, self-driving car, Silicon Valley, smart meter, Steve Ballmer, Steve Jobs, Steve Wozniak, Tony Hsieh, Toyota Production System, young professional
Compare Boone Pickens’s resiliency to the hesitancy that affected Ron Wayne, an original partner in Apple Computer. Wayne had started a slot machine business that failed, swallowing $50,000 of savings. After that failure he went to work at Atari, where he met Steve Jobs. When Jobs later asked Wayne to join Apple Computer as a third partner to balance and adjudicate between Jobs and the engineering wunderkind Steve Wozniak, Wayne was initially enthusiastic. But then it became clear that they were going to structure the nascent Apple Computer as a partnership. Wayne, who was significantly older than his partners, was worried about the personal liability he would incur if all the borrowing and spending Jobs was doing to manufacture the Apple I at volume did not pan out. The fear overcame him and a few days after they filed the business paperwork he pulled out.33 HOW EXECUTIVES CAN LEARN TO REVERSE THE RISK EQUATION Producers aren’t knocked out of the entrepreneurial game by defeats—even those that seem entirely devastating.
He has since founded Vatera Healthcare Partners, a health venture capital firm, and Arisaph Pharmaceuticals, a biotech discovery firm.
Steve Jobs 1955–2011, United States Apple Computer, Pixar
Jobs was a game designer at Atari when he, Steve Wozniak, and Ronald Wayne launched Apple Computer in 1976 to market a personal computer Wozniak had invented. The first Apple PCs proved a huge success, but later products floundered. Infighting led to Jobs’s 1985 ouster. He founded NeXT Computer and bought the Pixar animation studio from George Lucas. Pixar’s 1995 IPO made Jobs a billionaire. Two years later, Apple bought NeXT and reinstated Jobs as CEO, ushering in an era of tremendous innovation and growth driven by the iPod, iPhone, and iPad. Steve Jobs died of pancreatic cancer in 2011.
Kirk Kerkorian b. 1917, United States International Leisure, MGM/United Artists, MGM Resorts International
A flight instructor as a young man, Kirk Kerkorian then risked his life flying Mosquito bombers for the Canadian Royal Air Force during World War II.
He stood behind his people and he was happy that he could do it.” THE PREVALENCE OF PRODUCER-PERFORMER PAIRS More than half of the billionaires in our study sample started their businesses as part of a Producer-Performer team.2 The number jumps to 60 percent when we remove financial industry billionaires from the sample.3 Some famous examples include Steve Jobs (Producer) and Steve Wozniak (engineering Performer) of Apple; Nike’s Bill Bowerman (Producer) and Phil Knight (Performer); and Amancio Ortega (Producer) and his first wife, Rosalia Mera (Performer), who together founded the apparel giant Zara. The prominence of pairs among the billionaires we observed cuts against a lot of what we always thought we knew about how people feel productive and successful in their professional lives. Yet once we saw it in the data and began investigating its dynamics, the prominence of a Leadership Partnership began to make intuitive sense.
Racing the Beam: The Atari Video Computer System by Nick Montfort, Ian Bogost
At the time, the chip was the cheapest CPU on the market by far, and it was also faster than competing chips like the Motorola 6800 and the Intel 8080.19 The 6502’s low cost and high performance made it an immensely popular processor for more than a decade. The chip drove the Apple I and Apple ][, the Commodore PET and Commodore 64, the Atari 400 and 800 home computers, and the Nintendo Entertainment System (NES). It is still used today in some embedded systems. This chip seemed attractive, as cost was the primary consideration in the design of the Atari VCS. The system needed to be much more affordable than a personal computer, which was still a very rare and expensive commodity. When Apple Computer released the popular Apple ][ in 1977, it cost $1,298, even after Steve Wozniak’s many cost- and component-saving engineering tricks. The same year, Atari released the VCS for $199. The price was just above the console’s manufacturing cost, a common strategy today but an unusual one in the 1970s.
Such was the case for the Tandy TRS-80 and Commodore PET, both also released in 1977. The Apple ][’s graphics and sound system was implemented in a similar but more sophisticated way, thanks in part to Steve Wozniak’s experience designing an Atari arcade game. As Wozniak explained: A lot of features of the Apple ][ went in because I had designed Breakout for Atari. I had designed it in hardware. I wanted to write it in software now. So that was the reason that color was added in first—so that games could be programmed. I sat down one night and tried to put it into BASIC. Fortunately I had written the BASIC myself, so I just burned some new ROMs with line drawing commands, color changing commands, and various BASIC commands that would plot in color. I got this ball bouncing around, and I said, “Well it needs sound,” and I had to add a speaker to the Apple ][. It wasn’t planned, it was just accidental.23 Wozniak engineered capabilities into ROM, burning what he needed onto chips that went onto the motherboard.
Arcade. Distributed by Centauri. 1980.
Atari. Pong. Arcade. Designed by Nolan Bushnell. Engineered by Al Alcorn. 1972.
Atari. Gran Trak 10. Arcade. 1974.
Atari. Touch Me. Arcade. 1974.
Atari. Anti-Aircraft. Arcade. 1975.
Atari. Home Pong. Engineered by Al Alcorn, Bob Brown, and Harold Lee. 1975.
Atari. Breakout. Arcade. Designed by Nolan Bushnell and Steve Bristow. Engineered by Gary Waters and Steve Wozniak. 1976.
Atari. Night Driver. Arcade. Programmed by Dave Shepperd. Engineered by Ron Milner, Steve Mayer, and Terry Fowler. 1976.
Atari. Air-Sea Battle. Atari VCS. Programmed by Larry Kaplan. 1977.
Atari. Basic Math. Atari VCS. Programmed by Gary Palmer. 1977.
Atari. Blackjack. Atari VCS. Programmed by Bob Whitehead. 1977.
Atari. Combat. Atari VCS. Programmed by Joe Decuir and Larry Wagner. 1977.
The Hacker Crackdown by Bruce Sterling
Apple II, back-to-the-land, game design, ghettoisation, Haight Ashbury, Howard Rheingold, HyperCard, index card, informal economy, Jaron Lanier, Mitch Kapor, pirate software, plutocrats, Plutocrats, Silicon Valley, Steve Wozniak, Steven Levy, Stewart Brand, The Hackers Conference, the scientific method, Whole Earth Catalog, Whole Earth Review
Before computers and their phone-line modems entered American homes in gigantic numbers, phone phreaks had their own special telecommunications hardware gadget, the famous "blue box." This fraud device (now rendered increasingly useless by the digital evolution of the phone system) could trick switching systems into granting free access to long-distance lines. It did this by mimicking the system's own signal, a tone of 2600 hertz. Steven Jobs and Steve Wozniak, the founders of Apple Computer, Inc., once dabbled in selling blue-boxes in college dorms in California. For many, in the early days of phreaking, blue-boxing was scarcely perceived as "theft," but rather as a fun (if sneaky) way to use excess phone capacity harmlessly. After all, the long-distance lines were JUST SITTING THERE.... Whom did it hurt, really? If you're not DAMAGING the system, and you're not USING UP ANY TANGIBLE RESOURCE, and if nobody FINDS OUT what you did, then what real harm have you done?
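The passage above turns on one technical fact: the blue box worked by reproducing the network's own 2600 Hz supervisory tone. A minimal sketch in Python of what "mimicking the system's own signal" amounts to numerically; the frequency comes from the text, while the sample rate, duration, and function name are illustrative assumptions (the real blue box was an analog oscillator, not a sampled waveform):

```python
import math

def tone_samples(freq_hz: float = 2600.0,
                 sample_rate: int = 8000,
                 duration_s: float = 0.5) -> list[float]:
    """Return duration_s seconds of a pure sine tone as floats in [-1.0, 1.0]."""
    n = int(sample_rate * duration_s)
    # One sample per tick of the sample clock: sin(2*pi*f*t), t = i / sample_rate
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

samples = tone_samples()  # 4000 samples of a 2600 Hz sine
```

Because the signaling was in-band (carried on the same voice channel as the call), any source that could produce a clean tone at that frequency was indistinguishable, to the switch, from the network itself.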
On the contrary, like most rock musicians, the Grateful Dead have spent their entire adult lives in the company of complex electronic equipment. They have funds to burn on any sophisticated tool and toy that might happen to catch their fancy. And their fancy is quite extensive. The Deadhead community boasts any number of recording engineers, lighting experts, rock video mavens, electronic technicians of all descriptions. And the drift goes both ways. Steve Wozniak, Apple's co-founder, used to throw rock festivals. Silicon Valley rocks out. These are the 1990s, not the 1960s. Today, for a surprising number of people all over America, the supposed dividing line between Bohemian and technician simply no longer exists. People of this sort may have a set of windchimes and a dog with a knotted kerchief 'round its neck, but they're also quite likely to own a multimegabyte Macintosh running MIDI synthesizer software and trippy fractal simulations.
Furthermore, proclaimed the manifesto, the foundation would "fund, conduct, and support legal efforts to demonstrate that the Secret Service has exercised prior restraint on publications, limited free speech, conducted improper seizure of equipment and data, used undue force, and generally conducted itself in a fashion which is arbitrary, oppressive, and unconstitutional." "Crime and Puzzlement" was distributed far and wide through computer networking channels, and also printed in the Whole Earth Review. The sudden declaration of a coherent, politicized counter-strike from the ranks of hackerdom electrified the community. Steve Wozniak (perhaps a bit stung by the NuPrometheus scandal) swiftly offered to match any funds Kapor offered the Foundation. John Gilmore, one of the pioneers of Sun Microsystems, immediately offered his own extensive financial and personal support. Gilmore, an ardent libertarian, was to prove an eloquent advocate of electronic privacy issues, especially freedom from governmental and corporate computer-assisted surveillance of private citizens.
Thinking Machines: The Inside Story of Artificial Intelligence and Our Race to Build the Future by Luke Dormehl
Ada Lovelace, agricultural Revolution, AI winter, Albert Einstein, Alexey Pajitnov wrote Tetris, algorithmic trading, Amazon Mechanical Turk, Apple II, artificial general intelligence, Automated Insights, autonomous vehicles, book scanning, borderless world, call centre, cellular automata, Claude Shannon: information theory, cloud computing, computer vision, correlation does not imply causation, crowdsourcing, drone strike, Elon Musk, Flash crash, friendly AI, game design, global village, Google X / Alphabet X, hive mind, industrial robot, information retrieval, Internet of things, iterative process, Jaron Lanier, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kickstarter, Kodak vs Instagram, Law of Accelerating Returns, life extension, Loebner Prize, Marc Andreessen, Mark Zuckerberg, Menlo Park, natural language processing, Norbert Wiener, out of africa, PageRank, pattern recognition, Ray Kurzweil, recommendation engine, remote working, RFID, self-driving car, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, social intelligence, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, technological singularity, The Coming Technological Singularity, The Future of Employment, Tim Cook: Apple, too big to fail, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!
Early Siri reviews were very positive when the iPhone 4s launched in 2011. Over time, however, cracks began to show. Embarrassingly, Apple co-founder Steve Wozniak – who left Apple decades earlier – was one vocal critic of the service, noting how Apple’s own-brand version seemed less intelligent than the original third-party Siri app. What had won him over about the first Siri, he said, was its ability to correctly answer the questions, ‘What are the five largest lakes in California?’ and ‘What are the prime numbers greater than eighty-seven?’ Now, questions about California’s five largest lakes brought up links to lakefront properties. Questions about prime numbers pointed him to restaurants that served prime ribs. Improvements were clearly needed. Have your AI Speak to my AI While Apple poured its resources into fixing Siri, other companies launched their own competitors.
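Wozniak's second benchmark question has a definite answer that a few lines of code can verify. A small sketch; the upper bound of 100 is an arbitrary assumption for the example, since "the prime numbers greater than eighty-seven" is an unbounded set:

```python
def is_prime(n: int) -> bool:
    """Trial division; fine for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# Primes greater than 87, capped at 100 for illustration.
primes_over_87 = [n for n in range(88, 101) if is_prime(n)]
# → [89, 97]
```

The point of the anecdote is that the question is trivially answerable by computation, which is what made Siri's "prime rib" response so damning.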
Virtually none of it was achieved using Good Old-Fashioned AI. The company’s name, of course, was Google. CHAPTER 2 Another Way to Build AI IT IS 2014 and, in the Google-owned London offices of an AI company called DeepMind, a computer whiles away the hours by playing an old Atari 2600 video game called Breakout. The game was designed in the early 1970s by two young men named Steve Jobs and Steve Wozniak, who later went on to start a company called Apple. Breakout is essentially a variation on the bat-and-ball tennis game Pong, except that instead of hitting the square ‘ball’ across the screen to another player, you fire it at a wall of bricks which smash on impact. The goal is to destroy all of the bricks. As we saw in the previous chapter, there is nothing at all unusual about AI playing games. Alan Turing wrote the world’s first chess program as far back as 1947, although computers were not yet powerful enough to run it at the time.
Epub ISBN: 9780753551653 Version 1.0 3 5 7 9 10 8 6 4 2 WH Allen, an imprint of Ebury Publishing, 20 Vauxhall Bridge Road, London SW1V 2SA WH Allen is part of the Penguin Random House group of companies whose addresses can be found at global.penguinrandomhouse.com Copyright © Luke Dormehl 2016 Cover design: Two Associates Luke Dormehl has asserted his right to be identified as the author of this Work in accordance with the Copyright, Designs and Patents Act 1988 First published by WH Allen in 2016 www.eburypublishing.co.uk A CIP catalogue record for this book is available from the British Library ISBN 9780753556740 Chapter 1 fn1 The answer, in case you want to prove yourself as smart as an AI, is 162. Chapter 3 fn1 Coffee, as it turns out, is a good starting point for a discussion about smart devices. Apple’s co-founder Steve Wozniak once said that he could never foresee a robot with enough general intelligence to walk into a strange house and make a cup of coffee. Exploring this hypothesis, some researchers now suggest the ‘coffee test’ as a potential measure for AGI, Artificial General Intelligence. I will discuss AGI later on in this book. Chapter 4 fn1 To be fair to Mitsuku, very few of us would have a good answer if this question were put to us.
A People’s History of Computing in the United States by Joy Lisi Rankin
activist fund / activist shareholder / activist investor, Albert Einstein, Apple II, Bill Gates: Altair 8800, computer age, corporate social responsibility, Douglas Engelbart, Douglas Engelbart, Grace Hopper, Hacker Ethic, Howard Rheingold, Howard Zinn, Jeff Bezos, John Markoff, John von Neumann, Mark Zuckerberg, Menlo Park, Mother of all demos, Network effects, Norbert Wiener, pink-collar, profit motive, RAND corporation, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, the market place, urban planning, Whole Earth Catalog, wikimedia commons
Increasingly, people would have to purchase computers and software (now, devices and apps) for their personal and social computing. BASIC also figures prominently in the history of Apple. Steve Wozniak produced his own “Integer BASIC” for his homemade computer, built around MOS Technology’s 6502 microprocessor chip; he shared Integer BASIC, and he even published programs in Dr. Dobb’s Journal.29 When Wozniak’s high school chum Steve Jobs saw the computer, he proposed they team up to assemble and sell them. They named the computer Apple, and soon began working on a new version, the Apple II. Although Apple declared its philosophy was “to provide software for our machines free or at minimal cost,” Apple sought (aggressively) to sell its hardware.30 Whether they were called home computers, hobby computers, microcomputers, or personal computers, they were consumer products, purveyed by Steve Jobs.
The BASIC programs shared freely around the Dartmouth network and on the pages of the People’s Computer Company newsletter fueled the imaginations of many—including Steve Wozniak and Bill Gates. Gates first learned to program in BASIC, the language on which he built his Microsoft empire. Wozniak adapted Tiny BASIC into Integer BASIC to program his homemade computer, the computer that attracted the partnership of Steve Jobs and launched Apple. And the Minnesota software library, mostly BASIC programs including The Oregon Trail, proved to be the ideal complement for the hardware of Apple Computers. During the 1980s, the combination of Apple hardware and MECC software cemented the transformation from computing citizens to computing consumers.
International Business Machines, much more familiar as IBM, dominated the era when computers were the remote and room-size machines of the military-industrial complex. Then, around 1975, along came the California hobbyists who created personal computers and liberated us from the monolithic mainframes. They were young men in the greater San Francisco Bay Area, and they tinkered in their garages. They started companies: Steve Jobs and Steve Wozniak established Apple; Bill Gates and Paul Allen developed Microsoft. Then, in the 1990s, along came the Internet to connect all of those personal computers, and the people using them. Another round of eccentric nerds (still all young white men)—Jeff Bezos, Sergey Brin, Larry Page, and Mark Zuckerberg among them—gave us Amazon, Google, Facebook, and the fiefdoms of Silicon Valley. Walter Isaacson’s The Innovators expands the popular narrative of digital history to include less familiar contributors such as the nineteenth-century mathematician Charles Babbage and the twentieth-century computing visionary J.
The Code: Silicon Valley and the Remaking of America by Margaret O'Mara
"side hustle", A Declaration of the Independence of Cyberspace, accounting loophole / creative accounting, affirmative action, Airbnb, AltaVista, Amazon Web Services, Apple II, Apple's 1984 Super Bowl advert, autonomous vehicles, back-to-the-land, barriers to entry, Ben Horowitz, Berlin Wall, Bob Noyce, Buckminster Fuller, Burning Man, business climate, Byte Shop, California gold rush, carried interest, clean water, cleantech, cloud computing, cognitive dissonance, commoditize, computer age, continuous integration, cuban missile crisis, Danny Hillis, DARPA: Urban Challenge, deindustrialization, different worldview, don't be evil, Donald Trump, Doomsday Clock, Douglas Engelbart, Dynabook, Edward Snowden, El Camino Real, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Frank Gehry, George Gilder, gig economy, Googley, Hacker Ethic, high net worth, Hush-A-Phone, immigration reform, income inequality, informal economy, information retrieval, invention of movable type, invisible hand, Isaac Newton, Jeff Bezos, Joan Didion, job automation, job-hopping, John Markoff, Julian Assange, Kitchen Debate, knowledge economy, knowledge worker, Lyft, Marc Andreessen, Mark Zuckerberg, market bubble, mass immigration, means of production, mega-rich, Menlo Park, Mikhail Gorbachev, millennium bug, Mitch Kapor, Mother of all demos, move fast and break things, mutually assured destruction, new economy, Norbert Wiener, old-boy network, pattern recognition, Paul Graham, Paul Terrell, paypal mafia, Peter Thiel, pets.com, pirate software, popular electronics, pre–internet, Ralph Nader, RAND corporation, Richard Florida, ride hailing / ride sharing, risk tolerance, Robert Metcalfe, Ronald Reagan, Sand Hill Road, Second Machine Age, self-driving car, shareholder value, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, skunkworks, Snapchat, social graph, software is eating the world, speech recognition, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supercomputer in your pocket, technoutopianism, Ted Nelson, the market place, the new new thing, There's no reason for any individual to have a computer in his home - Ken Olsen, Thomas L Friedman, Tim Cook: Apple, transcontinental railway, Uber and Lyft, uber lyft, Unsafe at Any Speed, upwardly mobile, Vannevar Bush, War on Poverty, We wanted flying cars, instead we got 140 characters, Whole Earth Catalog, WikiLeaks, William Shockley: the traitorous eight, Y Combinator, Y2K
A great admirer of Intel’s Bob Noyce, Jobs wanted to build a campaign for the Apple II that was as jazzy as the one that had propelled the Intel 8080 into the stratosphere. In a replay of his audacious call to Bill Hewlett a decade earlier, Jobs dialed up the Intel switchboard, where someone connected him with the man who’d crafted that marketing campaign, Regis McKenna Himself. McKenna was unfazed by Apple’s garage setting and the co-founders’ scraggly looks. He’d worked with “lots of strange people” in the Valley already, and he was familiar with the Homebrew scene and the intriguing little enterprises bubbling up from it. The first meeting, however, was a bust. The Steves wanted help placing a Woz-authored article on the Apple II in Byte. It turned out that Steve Wozniak was much better at building elegant motherboards than crafting accessible prose; the piece was a rambling mess better suited for the hobbyist crowd back over at Dr.
The boldfaced headline blazed out at readers opening their copies of The Wall Street Journal the morning of August 13, 1980, sitting atop a full-page advertisement for Apple Computer. Below came a page crowded with print, accompanied by a portrait of its credited author, a professorially bearded Steven P. Jobs, “talk[ing] about the computer, and its effect on society.” The copy repeatedly referred to Jobs as “the inventor” of the personal computer, an artful fabrication that glossed over the fact that the elegant innards of the Apple came from the inventive mind of Jobs’s media-shy co-founder, Steve Wozniak. No insider technical specs here. The ad used simple, evocative language. “Think of the large computers (the mainframes and the minis) as the passenger trains and the Apple personal computer as a Volkswagen,” Jobs wrote. A Beetle might not be as powerful as a passenger train, but it could take you anywhere you wanted to go, on your own schedule.
Of the motley dozens of early start-ups, however, the few that scaled up into million-dollar ventures also involved people like Lore Harp and Carole Ely: people who understood how to run a company, and how to sell the high-tech dream to customers who’d never taken apart a radio set or subscribed to Popular Electronics. RISE OF THE STEVES This of course became the secret of Apple Computer Co., the most legendary Homebrew product of them all. The company wasn’t all that different from the dozens that sprouted from computer-club soil in 1975 and 1976. But it pulled away from the pack because, very early on, it bridged the hacker world of “The O” and storefront computer labs with the Silicon Valley ecosystem of the Wagon Wheel and Sand Hill Road. While baking countercultural credentials into its corporate positioning from the start, Apple was the first personal-computer company to join the silicon capitalists. At the beginning, Steve Wozniak was just another hacker in Gordon French’s damp garage, standing out a bit because he was a few shades more tech-obsessed.
Originals: How Non-Conformists Move the World by Adam Grant
Albert Einstein, Apple's 1984 Super Bowl advert, availability heuristic, barriers to entry, business process, business process outsourcing, Cass Sunstein, clean water, cognitive dissonance, creative destruction, cuban missile crisis, Daniel Kahneman / Amos Tversky, Dean Kamen, double helix, Elon Musk, fear of failure, Firefox, George Santayana, Ignaz Semmelweis: hand washing, Jeff Bezos, job satisfaction, job-hopping, Joseph Schumpeter, Kickstarter, Lean Startup, Louis Pasteur, Mahatma Gandhi, Mark Zuckerberg, meta analysis, meta-analysis, minimum viable product, Nelson Mandela, Network effects, pattern recognition, Paul Graham, Peter Thiel, Ralph Waldo Emerson, random walk, risk tolerance, Rosa Parks, Saturday Night Live, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, Steven Pinker, The Wisdom of Crowds, women in the workforce
Eventually, a major cardinal learned of his work and wrote a letter encouraging Copernicus to publish it. Even then, Copernicus stalled for four more years. His magnum opus only saw the light of day after a young mathematics professor took matters into his own hands and submitted it for publication. Almost half a millennium later, when an angel investor offered $250,000 to Steve Jobs and Steve Wozniak to bankroll Apple in 1977, it came with an ultimatum: Wozniak would have to leave Hewlett-Packard. He refused. “I still intended to be at that company forever,” Wozniak reflects. “My psychological block was really that I didn’t want to start a company. Because I was just afraid,” he admits. Wozniak changed his mind only after being encouraged by Jobs, multiple friends, and his own parents. We can only imagine how many Wozniaks, Michelangelos, and Kings never pursued, publicized, or promoted their original ideas because they were not dragged or catapulted into the spotlight.
If you’re a freewheeling gambler, your startup is far more fragile. Like the Warby Parker crew, the entrepreneurs whose companies topped Fast Company’s recent most innovative lists typically stayed in their day jobs even after they launched. Former track star Phil Knight started selling running shoes out of the trunk of his car in 1964, yet kept working as an accountant until 1969. After inventing the original Apple I computer, Steve Wozniak started the company with Steve Jobs in 1976 but continued working full time in his engineering job at Hewlett-Packard until 1977. And although Google founders Larry Page and Sergey Brin figured out how to dramatically improve internet searches in 1996, they didn’t go on leave from their graduate studies at Stanford until 1998. “We almost didn’t start Google,” Page says, because we “were too worried about dropping out of our Ph.D. program.”
Entrepreneurs who kept their day jobs: Joseph Raffiee and Jie Feng, “Should I Quit My Day Job? A Hybrid Path to Entrepreneurship,” Academy of Management Journal 57 (2014): 936–63. Phil Knight: Bill Katovsky and Peter Larson, Tread Lightly: Form, Footwear, and the Quest for Injury-Free Running (New York: Skyhorse Publishing, 2012); David C. Thomas, Readings and Cases in International Management: A Cross-Cultural Perspective (Thousand Oaks, CA: Sage Publications, 2003). Steve Wozniak: Jessica Livingston, Founders at Work: Stories of Startups’ Early Days (Berkeley, CA: Apress, 2007). “We almost didn’t start Google”: Personal conversations with Larry Page on September 15 and 16, 2014, and “Larry Page’s University of Michigan Commencement Address,” May 2, 2009, http://googlepress.blogspot.com/2009/05/larry-pages-university-of-michigan.html; Google Investor Relations, https://investor.google.com/financial/tables.html.
Intertwingled: The Work and Influence of Ted Nelson (History of Computing) by Douglas R. Dechow
3D printing, Apple II, Bill Duvall, Brewster Kahle, Buckminster Fuller, Claude Shannon: information theory, cognitive dissonance, computer age, conceptual framework, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Snowden, game design, HyperCard, hypertext link, information retrieval, Internet Archive, Jaron Lanier, knowledge worker, linked data, Marc Andreessen, Marshall McLuhan, Menlo Park, Mother of all demos, pre–internet, RAND corporation, semantic web, Silicon Valley, software studies, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, the medium is the message, Vannevar Bush, Wall-E, Whole Earth Catalog
MIT Press, Cambridge, MA 17. Wing JM (2006) Computational thinking. Commun ACM 49(3) 18. Wozniak S (2014) In “Intertwingled: afternoon session #2.” Chapman University, Orange, California. Video timecode: 58:14. http://ibc.chapman.edu/Mediasite/Play/52694e57c4b546f0ba8814ec5d9223ae1d Footnotes 1For example, as Steve Wozniak said at Intertwingled, “At our computer club, the bible was Computer Lib” — referring to the Homebrew Computer Club, from which Apple Computer and other major elements of the turn to personal computers emerged. 2“Computational thinking is the process of recognising aspects of computation in the world that surrounds us, and applying tools and techniques from Computer Science to understand and reason about both natural and artificial systems and processes”. 3“Computational Media” has recently emerged as a name for the type of work that performs this interdisciplinary integration. 4Kodu is both an influential system itself and the basis of Microsoft’s Project Spark, launched in October 2014. 5The first stage of our work is described in “Say it With Systems”.
Technical compromises made in the early days of the World Wide Web undermined Ted’s ability to implement hypertext on a large scale. He continues to rail at this constraint. Forty years after Computer Lib, computers are far more sophisticated and the networks among digital objects are much richer and more complex. It is time to revisit fundamental assumptions of networked computing, such as the directionality of links, a point made by multiple speakers at the symposium—Wendy Hall, Jaron Lanier, Steve Wozniak, and Rob Akscyn amongst them.1
Fig. 10.3 Ordinary hypertext, with multi-directional links. From Literary Machines (Used with permission)
10.2.3 Managing Research Data
Managing research data is similarly a problem of defining and maintaining relationships amongst multi-media objects. Research data do not stand alone. They are complex objects that can be understood only in relation to their context, which often includes software, protocols, documentation, and other entities scattered over time and space.
Some of what I showed during my talk is what Steve Jobs saw, and the Macintosh was a result of his glimpse and also interpretations of that glimpse by him and others at Apple. But it missed a number of really important ideas. Many of Ted’s and Doug’s ideas have been missed. So, with all this working against someone like Ted, why bother having visions? Standard schooling is already trying to convert two-eyed children into standard children, that is, into blind children. Why not just put more effort into this and save all the bother? To me, the visionaries are the most important people we have because it is only by comparing their ideas with our normal ideas that we can gauge how we are doing. Otherwise, as it is for most people, normal becomes reality, and they only measure from that less broad view of reality. Toss Ted back into this mix, and you’ve upset the Apple cart—and that’s what we need! This allows us to see that normal is only one of many possible constructions of reality, and some of them could have been much better.
The Start-Up of You: Adapt to the Future, Invest in Yourself, and Transform Your Career by Reid Hoffman, Ben Casnocha
Airbnb, Andy Kessler, Black Swan, business intelligence, Cal Newport, Clayton Christensen, commoditize, David Brooks, Donald Trump, en.wikipedia.org, fear of failure, follow your passion, future of work, game design, Jeff Bezos, job automation, Joi Ito, late fees, lateral thinking, Marc Andreessen, Mark Zuckerberg, Menlo Park, out of africa, Paul Graham, paypal mafia, Peter Thiel, recommendation engine, Richard Bolles, risk tolerance, rolodex, shareholder value, side project, Silicon Valley, Silicon Valley startup, social web, Steve Jobs, Steve Wozniak, Tony Hsieh, transaction costs
In 1975 a group of microcomputer enthusiasts in the Bay Area formed the Homebrew Computer Club and invited those who shared their interests in technology to “come to a gathering of people with like-minded interests. Exchange information, swap ideas, help work on a project, whatever.”8 Five hundred young geeks joined, and of them, twenty went on to start computer companies, including Steve Wozniak, who cofounded Apple. Homebrew helped establish the distinctly Silicon Valley model of disseminating opportunities and information through informal networks (something we’ll discuss in the Network Intelligence chapter). Small, informal networks are still uniquely efficient at circulating ideas. It’s why we still have local PTAs and alumni groups from schools. Book groups. Beekeeping clubs. Conferences and industry meetings.
World-class professionals build networks to help them navigate the world. No matter how brilliant your mind or strategy, if you’re playing a solo game, you’ll always lose out to a team. Athletes need coaches and trainers, child prodigies need parents and teachers, directors need producers and actors, politicians need donors and strategists, scientists need lab partners and mentors. Penn needed Teller. Ben needed Jerry. Steve Jobs needed Steve Wozniak. Indeed, teamwork is eminently on display in the start-up world. Very few start-ups are started by only one person. Everyone in the entrepreneurial community agrees that assembling a talented team is as important as it gets. Venture capitalists invest in people as much as in ideas. VCs will frequently back stellar founders with a so-so idea over mediocre founders with a good idea, on the belief that smart and adaptable people will maneuver their way to something that works.
There was clearly growing market demand for folks who had experience with the Internet. But did I have the skills, and could I make enough connections in the tech industry, to become a hitter? To find out, I tried. I got a job (via a friend of a friend) at Apple Computer in Cupertino. Apple hired me into their user experience group, but shortly after starting on the job I learned that product/market fit—the focus of product management—mattered more than user experience or design. You can develop great and important user interfaces, and Apple certainly did, but if customers don’t need or want the product, they won’t buy. At Apple, and in most companies, the product/market fit questions fall under the purview of the product management group, not user experience. And because product management is vital in any product organization, work experience in the area tends to lead to more diverse career opportunities.
Falter: Has the Human Game Begun to Play Itself Out? by Bill McKibben
23andMe, Affordable Care Act / Obamacare, Airbnb, American Legislative Exchange Council, Anne Wojcicki, artificial general intelligence, Bernie Sanders, Bill Joy: nanobots, Burning Man, call centre, carbon footprint, Charles Lindbergh, clean water, Colonization of Mars, computer vision, David Attenborough, Donald Trump, double helix, Edward Snowden, Elon Musk, ending welfare as we know it, energy transition, Flynn Effect, Google Earth, Hyperloop, impulse control, income inequality, Intergovernmental Panel on Climate Change (IPCC), Jane Jacobs, Jaron Lanier, Jeff Bezos, job automation, life extension, light touch regulation, Mark Zuckerberg, mass immigration, megacity, Menlo Park, moral hazard, Naomi Klein, Nelson Mandela, off grid, oil shale / tar sands, pattern recognition, Peter Thiel, plutocrats, profit motive, Ralph Waldo Emerson, Ray Kurzweil, Robert Mercer, Ronald Reagan, Sam Altman, self-driving car, Silicon Valley, Silicon Valley startup, smart meter, Snapchat, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, supervolcano, technoutopianism, The Wealth of Nations by Adam Smith, traffic fines, Travis Kalanick, urban sprawl, Watson beat the top human players on Jeopardy!, Y Combinator, Y2K, yield curve
Google spelled out its corporate logo in mirrors at the giant solar station in the Mojave Desert on the day it announced that it would power every last watt of its global business with renewable energy; it’s the world’s biggest corporate purchaser of green power.2 But there is exactly one human being who bridges that cultural gulf between these different species of plutocrat. Vanity Fair, in 2016, declared that Ayn Rand was “perhaps the most influential figure in the tech industry.” Steve Wozniak (cofounder of Apple) said that Steve Jobs (deity) considered Atlas Shrugged one of his guides in life.3 Elon Musk (also a deity, and straight out of a Rand novel, with his rockets and hyperloops and wild cars) says Rand “has a fairly extreme set of views, but she has some good points in there.”4 That’s as faint as the praise gets. Travis Kalanick, who founded Uber, used the cover of The Fountainhead as his Twitter avatar.
“If you have a home with a teenage son,” he writes, “you can conduct your own experiment. Provide him with a minimum subsidy of Coke and pizza, and then remove all demands for work and all parental supervision. The likely outcome is that he will remain in his room for days, glued to the screen. He won’t do any homework or housework, will skip school, skip meals, and even skip showers and sleep. Yet he is unlikely to suffer from boredom or a sense of purposelessness.”8 Steve Wozniak, cofounder of Apple, predicts that robots will graciously take us on as pets so we can “be taken care of all the time.”9 He added that he was now feeding his dog filet mignon, on the principle of “do unto others.” None of that is why we’re developing artificial intelligence. (We’re developing it to make money, one business at a time.) But that is what many of the people who look closely at it think may happen.
Ed Regis, The Great Mambo Chicken and the Transhuman Condition (New York: Basic, 1990), p. 167.
5. Tim Urban, “The AI Revolution: The Road to Superintelligence,” Huffington Post, February 10, 2015.
6. Ibid.
7. Decca Aitkenhead, “James Lovelock: Before the End of This Century, Robots Will Have Taken Over,” Guardian, September 30, 2016.
8. Yuval Harari, “The Meaning of Life in a World without Work,” Guardian, May 8, 2017.
9. Samuel Gibbs, “Apple Co-founder Steve Wozniak Says Humans Will Be Robots’ Pets,” Guardian, June 25, 2015.
10. Paul Lewis, “‘Our Minds Can Be Hijacked’: The Tech Insiders Who Fear a Smartphone Dystopia,” Guardian, October 6, 2017.
11. Lanier, Ten Arguments, p. 18.
12. Sang In Jung et al., “The Effect of Smartphone Usage Time on Posture and Respiratory Function,” Journal of Physical Therapy Science 28, no. 1 (January 2016). 13.
So Good They Can't Ignore You: Why Skills Trump Passion in the Quest for Work You Love by Cal Newport
Apple II, bounce rate, business cycle, Byte Shop, Cal Newport, capital controls, cleantech, Community Supported Agriculture, deliberate practice, financial independence, follow your passion, Frank Gehry, information asymmetry, job satisfaction, job-hopping, knowledge worker, Mason jar, medical residency, new economy, passive income, Paul Terrell, popular electronics, renewable energy credits, Results Only Work Environment, Richard Bolles, Richard Feynman, rolodex, Sand Hill Road, side project, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, web application, winner-take-all economy
In 1974, after Jobs’s return from India, a local engineer and entrepreneur named Alex Kamradt started a computer time-sharing company dubbed Call-in Computer. Kamradt approached Steve Wozniak to design a terminal device he could sell to clients to use for accessing his central computer. Unlike Jobs, Wozniak was a true electronics whiz who was obsessed with technology and had studied it formally at college. On the flip side, however, Wozniak couldn’t stomach business, so he allowed Jobs, a longtime friend, to handle the details of the arrangement. All was going well until the fall of 1975, when Jobs left for the season to spend time at the All-One commune. Unfortunately, he failed to tell Kamradt he was leaving. When he returned, he had been replaced. I tell this story because these are hardly the actions of someone passionate about technology and entrepreneurship, yet this was less than a year before Jobs started Apple Computer. In other words, in the months leading up to the start of his visionary company, Steve Jobs was something of a conflicted young man, seeking spiritual enlightenment and dabbling in electronics only when it promised to earn him quick cash.
Now that we know what to look for, this transactional interpretation of compelling careers becomes suddenly apparent. Consider Steve Jobs. When Jobs walked into Paul Terrell’s Byte Shop he was holding something that was literally rare and valuable: the circuit board for the Apple I, one of the more advanced personal computers in the fledgling market at the time. The money from selling a hundred units of that original design gave Jobs more control in his career, but in classic economic terms, to get even more valuable traits in his working life, he needed to increase the value of what he had to offer. It’s at this point that Jobs’s ascent begins to accelerate. He takes on $250,000 in funding from Mike Markkula and works with Steve Wozniak to produce a new computer design that is unambiguously too good to be ignored. There were other engineers in the Bay Area’s Homebrew Computer Club culture who could match Jobs’s and Wozniak’s technical skill, but Jobs had the insight to take on investment and to focus this technical energy toward producing a complete product.
It was around the time I was transitioning from graduate school that I started to pull on these threads, eventually leading to my complete rejection of the passion hypothesis and kicking off my quest to find out what really matters for creating work you love. Rule #1 is dedicated to laying out my argument against passion, as this insight—that “follow your passion” is bad advice—provides the foundation for everything that follows. Perhaps the best place to start is where we began, with the real story of Steve Jobs and the founding of Apple Computer.
Do What Steve Jobs Did, Not What He Said
If you had met a young Steve Jobs in the years leading up to his founding of Apple Computer, you wouldn’t have pegged him as someone who was passionate about starting a technology company. Jobs had attended Reed College, a prestigious liberal arts enclave in Oregon, where he grew his hair long and took to walking barefoot. Unlike other technology visionaries of his era, Jobs wasn’t particularly interested in either business or electronics as a student.
Tools for Thought: The History and Future of Mind-Expanding Technology by Howard Rheingold
Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, card file, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, conceptual framework, Conway's Game of Life, Douglas Engelbart, Dynabook, experimental subject, Hacker Ethic, Howard Rheingold, interchangeable parts, invention of movable type, invention of the printing press, Jacquard loom, John von Neumann, knowledge worker, Marshall McLuhan, Menlo Park, Norbert Wiener, packet switching, pattern recognition, popular electronics, post-industrial society, RAND corporation, Robert Metcalfe, Silicon Valley, speech recognition, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, telemarketer, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture
A gap-toothed, crazy-eyed, full-bearded fellow who now writes software and stays away from illegal activities, Crunch traveled the highways in the late sixties and early seventies with a van full of electronic equipment, playing virtuoso pranks from roadside phone booths -- until he was caught, prosecuted, sentenced, and jailed. One of Crunch's phone hacking buddies from the outlaw days, Steve Wozniak, went on to bigger fame when he invented the first Apple computer. Captain Crunch, also known as John Draper, now makes very decent legitimate money as "Cap'n Software," the sole programmer for the microcomputer software company of the same name. At Project MAC, and at the subcultural counterparts at Stanford (where they began to blend some of their California brand of craziness into the hacker formula) and elsewhere, you had to suffer in order to be admitted to the more interesting levels of hacker wisdom.
It had been obvious from day one that a great many people wanted to have computers of their own. MITS had the usual problems associated with a successful start-up company. Roberts eventually sold it. In 1977, Commodore, Heathkit, and Radio Shack began marketing personal computers based on the interconnection method established by the Altair -- still known as the S100 bus. Steve Wozniak and Steve Jobs started selling Apples in 1977 and now are firmly established in the annals of Silicon Valley garage-workshop mythology -- the Hewlett and Packard of the seventies generation. Gates and Allen became Microsoft, Inc. Their company sold over $50 million worth of software to personal computer users in 1983. Microsoft is aiming for the hundred-million-dollar category, and Gates still has a couple more years before he reaches the age of thirty.
"A really good program designer makes an artist out of the person who uses the computer, by creating a world that puts them in the position of 'Here's the keyboard, and here's the screen. Now once you learn a few rudimentary computer skills, you can be a superstar.' " It was an unexpected, but perhaps not inappropriate philosophy to hear from a LISP hacker turned software vendor. He has yet to carve out an empire like Bill Gates or Steve Wozniak, but David Rodman knows that most of the potential consumers of microcomputer software are still in the earliest stages of their progression toward obsessive software intoxication. David sees a niche for people like himself as toolmakers and trailblazers, leading the way for the emergence of an entire population of programming artists. He wants programming to become a performing art. But long before hackers started thinking about using their computers for intellectual improvisation -- before David Rodman was born, in fact -- a dreamer out in California was designing his own kind of mind amplifier.
Quiet: The Power of Introverts in a World That Can't Stop Talking by Susan Cain
8-hour work day, Albert Einstein, Asperger Syndrome, Bill Gates: Altair 8800, call centre, crowdsourcing, David Brooks, delayed gratification, deliberate practice, game design, hive mind, index card, indoor plumbing, Isaac Newton, knowledge economy, knowledge worker, longitudinal study, Mahatma Gandhi, mass immigration, Menlo Park, meta-analysis, Mikhail Gorbachev, Nelson Mandela, new economy, popular electronics, Ralph Waldo Emerson, ride hailing / ride sharing, Rosa Parks, selective serotonin reuptake inhibitor (SSRI), shareholder value, Silicon Valley, Steve Jobs, Steve Wozniak, telemarketer, The Wisdom of Crowds, traveling salesman, twin studies, Walter Mischel, web application, white flight
But that night he goes home and sketches his first design for a personal computer, with a keyboard and a screen just like the kind we use today. Three months later he builds a prototype of that machine. And ten months after that, he and Steve Jobs cofound Apple Computer. Today Steve Wozniak is a revered figure in Silicon Valley—there’s a street in San Jose, California, named Woz’s Way—and is sometimes called the nerd soul of Apple. He has learned over time to open up and speak publicly, even appearing as a contestant on Dancing with the Stars, where he displayed an endearing mixture of stiffness and good cheer. I once saw Wozniak speak at a bookstore in New York City. A standing-room-only crowd showed up bearing their 1970s Apple operating manuals, in honor of all that he had done for them. But the credit is not Wozniak’s alone; it also belongs to Homebrew. Wozniak identifies that first meeting as the beginning of the computer revolution and one of the most important nights of his life.
Many of our most important civic institutions, from elections to jury trials to the very idea of majority rule, depend on dissenting voices. But when the group is literally capable of changing our perceptions, and when to stand alone is to activate primitive, powerful, and unconscious feelings of rejection, then the health of these institutions seems far more vulnerable than we think. But of course I’ve been simplifying the case against face-to-face collaboration. Steve Wozniak collaborated with Steve Jobs, after all; without their pairing, there would be no Apple today. Every pair bond between mother and father, between parent and child, is an act of creative collaboration. Indeed, studies show that face-to-face interactions create trust in a way that online interactions can’t. Research also suggests that population density is correlated with innovation; despite the advantages of quiet walks in the woods, people in crowded cities benefit from the web of interactions that urban life offers.
We also need to create settings in which people are free to circulate in a shifting kaleidoscope of interactions, and to disappear into their private workspaces when they want to focus or simply be alone. Our schools should teach children the skills to work with others—cooperative learning can be effective when practiced well and in moderation—but also the time and training they need to deliberately practice on their own. It’s also vital to recognize that many people—especially introverts like Steve Wozniak—need extra quiet and privacy in order to do their best work. Some companies are starting to understand the value of silence and solitude, and are creating “flexible” open plans that offer a mix of solo workspaces, quiet zones, casual meeting areas, cafés, reading rooms, computer hubs, and even “streets” where people can chat casually with each other without interrupting others’ workflow. At Pixar Animation Studios, the sixteen-acre campus is built around a football-field-sized atrium housing mailboxes, a cafeteria, and even bathrooms.
Built for Growth: How Builder Personality Shapes Your Business, Your Team, and Your Ability to Win by Chris Kuenne, John Danner
Airbnb, Amazon Web Services, Berlin Wall, Bob Noyce, business climate, call centre, cloud computing, disruptive innovation, don't be evil, Fall of the Berlin Wall, Gordon Gekko, Jeff Bezos, Kickstarter, Lean Startup, Mark Zuckerberg, pattern recognition, risk tolerance, Sand Hill Road, self-driving car, Silicon Valley, Steve Jobs, Steve Wozniak, supply-chain management, zero-sum game
Most importantly, we hope that as you close the final page, you will feel equipped to become a stronger builder for growth.
A Builder Personality Tour through Silicon Valley
Let’s see these personalities in action by taking a quick spin through the heart of Silicon Valley. Take a look at how each Builder Personality has shaped the structure, growth trajectory, and culture of four iconic companies.7
Apple’s Driver
Our first stop is on everybody’s short list of startup-to-standout success stories. Leaving aside how revolutionary Apple 1.0 was in transforming the computer industry when Steve Wozniak and Steve Jobs got started, consider the stamp the prodigal Jobs himself left on this company after his return in 1997 to spearhead the stunning revival of the company. His personality—born to build, an intuitive decision maker, with a controlling and often abrasive management style—shaped that company’s entire destiny.
Starting a business, much less building it into a large and durable enterprise, is never easy, and it’s often a lonely endeavor. It’s no surprise many builders—perhaps you included—choose to embark on that adventure with cofounders. That’s a decision that immediately puts the issue of Builder Personality front and center—for both of you. Just take a look at this partial list of cobuilders:
Apple: Steve Jobs and Steve Wozniak
Microsoft: Bill Gates and Paul Allen
Ben & Jerry’s: Ben Cohen and Jerry Greenfield
Intel: Gordon Moore and Bob Noyce
P & G: William Procter and James Gamble
Airbnb: Nathan Blecharczyk, Brian Chesky, and Joe Gebbia
Google: Sergey Brin and Larry Page
Rent the Runway: Jenn Hyman and Jenny Fleiss
Warby Parker: Neil Blumenthal, Dave Gilboa, Andrew Hunt, and Jeffrey Raider
Pinterest: Ben Silbermann, Evan Sharp, and Paul Sciarra
Eventbrite: Julia Hartz and Kevin Hartz
HP: Bill Hewlett and Dave Packard
These builder partnerships cut across industry, geographic, gender, and cultural lines.
His personality—born to build, an intuitive decision maker, with a controlling and often abrasive management style—shaped that company’s entire destiny. Jobs was a Driver’s Driver—relentless in lashing his company to his singular vision of “insanely great products” he knew the world (and his customers) needed, even before they did. This kind of market-sensing capability, propelled by an obsessive drive to launch the perfect market-fitting product, typifies the Builder Personality we call the Driver. Apple under Jobs’s direction was not a “we’re a big family” organization. It was—and is—a proud, defiant, and famously secretive place under the spotlight scrutiny of its brilliant, if sometimes mercurial, founder-CEO. Its fusion of beautiful design simplicity, functional technology, and innovative business models continues to reflect the transformative power a Driver can have—even when he or she sees way beyond everyone else’s headlights.
The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley by Leslie Berlin
Apple II, Bob Noyce, business cycle, collective bargaining, computer age, George Gilder, informal economy, John Markoff, Kickstarter, laissez-faire capitalism, low skilled workers, means of production, Menlo Park, Murray Gell-Mann, open economy, Richard Feynman, Ronald Reagan, Sand Hill Road, Silicon Valley, Silicon Valley startup, Steve Jobs, Steve Wozniak, union organizing, War on Poverty, women in the workforce, Yom Kippur War
This certainly was true of Apple Computer, which was financed by men associated with Fairchild and Intel and staffed with many people from Hewlett-Packard and Intel.46 Apple had gotten its start in 1976, when 19-year-old Jobs convinced his friend Steve Wozniak, who had developed a personal computer in his garage, to start a business with him. The two showed their computer to venture capitalist Don Valentine (a former Fairchild salesman), who suggested they contact Mike Markkula, recently retired (at age 34) from his job in Intel’s marketing group. Markkula, who had long dreamed of something like a personal computer—as a teenager, he had built a “programmable electronic sliderule”—invested $91,000 in the company. In exchange, he received a one-third ownership stake in Apple.47 One of Markkula’s first calls on behalf of Apple was to Noyce. “I want you to be aware of this,” Markkula said. “I’d like to present to the [Intel] board.” Noyce gave his approval and on the appointed day, Markkula and Steve Wozniak gave a presentation about the personal computer, an Apple II on hand for demonstration purposes.
But he was interesting enough to talk to, and soon Bowers found herself engrossed in what she called “all Steve’s schemes,” only half of which she thought were even remotely feasible. Clearly this was a company that needed her help. She agreed to consult for Apple.49 A few months into her consulting work, Bowers learned that Steve Wozniak wanted to sell some of his founders’ stock for $13 a share. She bought it from him. “Bob thought I was nuts,” she recalls. Noyce did not try to stop her from investing—they had long ago agreed that she could do what she liked with her money, and he could do the same with his—but he could not take Jobs and Wozniak seriously. Even Arthur Rock admits, “Steve Jobs and Steve Wozniak weren’t very appealing people in those days.” Wozniak was the telephone-era’s version of a hacker—he used a small box that emitted electronic tones to call around the world for free—and Steve Jobs’s ungroomed appearance was off-putting to Noyce.
Jobs thought that “Bob was the soul of Intel,” and Jobs wanted, he said, “to smell that second wonderful era of the valley, the semiconductor companies leading into the computer.”52 What did Noyce get out of the relationship? Jobs surmises, “Apple was probably the first Silicon Valley company that was widely known as a lifestyle company, the first that made a broad consumer product—and here I was, twenty-five [years old]. And for Bob, it was a bit of ‘What?! Who is this guy? What’s going on here?’ It was just a little strange for him. Bob might have been a little curious.” Apple went public in December of 1980 at $22 a share. The offering netted Apple more than $100 million, roughly 14 times the proceeds Intel received from its IPO.53 Noyce had a front-row seat for all that transpired at Apple. Not only was he one of Jobs’s mentors and Markkula’s friends, but in August of 1980, Ann Bowers joined the company as the human resources vice president. Apple burst with the young-company spirit and hunger that Noyce adored, and every once in a while, Noyce would, as Markkula put it, “come over to Apple and just hang around.
Growth Hacker Marketing: A Primer on the Future of PR, Marketing, and Advertising by Ryan Holiday
Airbnb, iterative process, Kickstarter, Lean Startup, Marc Andreessen, market design, minimum viable product, Paul Graham, pets.com, post-work, Silicon Valley, slashdot, Steve Wozniak, Travis Kalanick
But we don’t simply set up viral features and hope they work. Keeping our growth engine going is a step unto itself. We must dive deeply into the analytics available to us and refine, refine, refine until we get maximum results.
STEP 4 Close the Loop: Retention and Optimization
You need the kind of objectivity that makes you forget everything you’ve heard, clear the table, and do a factual study like a scientist would. —Steve Wozniak
If the growth hacking process begins with something I would have previously considered to be outside the marketer’s domain (product development), then I suppose it is only natural that it conclude with another. The traditional marketer’s job, as I learned in my time in fashion and publishing, is to get the leads—to bring in potential customers. It’s someone else’s job to figure out what to do with them.
Dropbox, for instance, offered its customers a 150 megabyte storage bonus if they linked their Dropbox account to their Facebook or Twitter account. Think of Hotmail, whose early attempts at growth hacking we looked at earlier. It turned every e-mail its users sent into a pitch to new customers. Think of Apple and BlackBerry, which turned their devices into advertising engines by adding “Sent from my iPhone” or “Sent from my BlackBerry” to every message sent. (Apple’s best and most compelling public move, of course, was the decision to make its headphones white instead of black. Now the millions of people who’ve bought devices from Apple advertise them everywhere.) Now start-ups are following this lead. Mailbox, an in-box organizer, adds a “Sent from Mailbox” line to the end of its users’ e-mails. When I filed my taxes this year with TurboTax, it asked me if I wanted to send out a prewritten tweet that said I’d gotten a refund by using its service.
Dogfight: How Apple and Google Went to War and Started a Revolution by Fred Vogelstein
Apple II, Ben Horowitz, cloud computing, commoditize, disintermediation, don't be evil, Dynabook, Firefox, Google Chrome, Google Glasses, Googley, John Markoff, Jony Ive, Marc Andreessen, Mark Zuckerberg, Peter Thiel, pre–internet, Silicon Valley, Silicon Valley startup, Skype, software patent, spectrum auction, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Tim Cook: Apple, web application, zero-sum game
All of this captivated not just consumers but investors. A year after Jobs had unveiled the iPhone, Apple’s stock price had doubled. Apple helped create and then took full advantage of all the hype. On launch day it sent top executives to various stores in big cities to witness it all and help whip up the crowds. Head of Global Marketing Phil Schiller went to Chicago. Jony Ive and his design crew went to San Francisco. Steve Jobs’s store was, naturally, the one in downtown Palo Alto at the corner of University Avenue and Kipling Street. It was a mile and a half from his house and he often showed up there unannounced when he was in town. The appropriate high-tech luminaries had already gathered when he arrived. Apple cofounder Steve Wozniak and early Apple employees Bill Atkinson and Andy Hertzfeld were already standing on line.
Jobs was particularly satisfied with this development, a confidant said—even though in the context of the other upheavals the iPad was unleashing it was almost a footnote. Thirty-five years after starting Apple with Steve Wozniak, Jobs was finally doing what he had set out to do all along: he was transforming what consumers and businesses expected from their computers. The Macintosh in 1984—the first mainstream machine to use a mouse—was supposed to have been the machine that did this. It was supposed to have taken a complicated device—the PC—and made it a consumer product that anyone could use. That failed. As everyone knows, Macs didn’t go away, but Microsoft Windows and Office get the credit for making the PC mainstream. Yet by 2011 the world had come full circle. If you counted desktop and mobile operating systems together, Apple’s computing platform was now about as big as Microsoft Windows and Windows Mobile.
Almost all the media coverage focused on Apple’s unreasonable and possibly unlawful control over its app store, portraying Jobs as a power-mad despot. In an effort not to look despotic, Apple tried to lead journalists into concluding that AT&T, not Apple, was behind all the rejections. But that made things even worse. It made the FCC wonder if Apple and AT&T were in some kind of improper collusion. Two months later, in response to Freedom of Information Act requests by the media, the FCC released its correspondence with the three companies. It did not make Apple look good. Google’s letter said, “Apple representatives informed Google that Google Voice was rejected because Apple believed the application duplicated the core dialer functionality of the iPhone. The Apple representatives indicated that the company did not want applications that could potentially replace such functionality.”
The Four: How Amazon, Apple, Facebook, and Google Divided and Conquered the World by Scott Galloway
activist fund / activist shareholder / activist investor, additive manufacturing, Affordable Care Act / Obamacare, Airbnb, Amazon Web Services, Apple II, autonomous vehicles, barriers to entry, Ben Horowitz, Bernie Sanders, big-box store, Bob Noyce, Brewster Kahle, business intelligence, California gold rush, cloud computing, commoditize, cuban missile crisis, David Brooks, disintermediation, don't be evil, Donald Trump, Elon Musk, follow your passion, future of journalism, future of work, global supply chain, Google Earth, Google Glasses, Google X / Alphabet X, Internet Archive, invisible hand, Jeff Bezos, Jony Ive, Khan Academy, longitudinal study, Lyft, Mark Zuckerberg, meta-analysis, Network effects, new economy, obamacare, Oculus Rift, offshore financial centre, passive income, Peter Thiel, profit motive, race to the bottom, RAND corporation, ride hailing / ride sharing, risk tolerance, Robert Mercer, Robert Shiller, Search for Extraterrestrial Intelligence, self-driving car, sentiment analysis, shareholder value, Silicon Valley, Snapchat, software is eating the world, speech recognition, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Stewart Brand, supercomputer in your pocket, Tesla Model S, Tim Cook: Apple, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, undersea cable, Whole Earth Catalog, winner-take-all economy, working poor, young professional
Here’s a list of the source of wealth for the ten richest people in Europe (who cares who they are, their companies are infinitely more interesting than they are): Zara, L’Oréal, H&M, LVMH, Nutella, Aldi, Lidl, Trader Joe’s, Luxottica, Crate & Barrel.12
The Luxury of Time
No technology firm has solved the problem of aging—losing relevance. As a luxury brand, Apple is the first technology company to have a shot at multigenerational success. Apple did not start as a luxury brand. It was the best house in a shitty neighborhood, tech hardware. A world of cables, geekware, acronyms, and low margins. In the early days, Apple simply made a more intuitive computer than its competitors. Steve Jobs’s notions about elegant packaging only appealed to a minority of customers; it was Steve Wozniak’s architecture that drew the rest. Back then, the company appealed largely to consumers’ brains. Many early Apple lovers were geeks (which did nothing for its sex appeal). Apple, to its credit, gazed across the tracks at luxury town and thought: Why not? Why can’t we be the best house in the best neighborhood?
As if he sat at a lab table in the R&D department at Apple headquarters in Cupertino and soldered chips on a tiny motherboard . . . until boom! he gave the world the iPod. Actually, that was Steve Wozniak with the Apple I a quarter century before. Steve Jobs was a genius—but his gifts lay elsewhere. And nowhere was that genius more visible than when business experts everywhere were proclaiming the “disintermediation” of tech—the disappearance of the physical distribution and retail channels as they were replaced by the virtualization of e-commerce. Jobs understood, as none of his peers did, that whereas content, even commodity products, might be sold online, if you wanted to sell electronics hardware as premium-priced luxury items, you had to sell them like other luxury items. That is, in shining temples, under brilliant lights, with ardent young “genius” salespeople at your beck and call.
Price, Rob. “Apple is taking 92% of profits in the entire smartphone industry.” Business Insider. July 13, 2015. http://www.businessinsider.com/apple-92-percent-profits-entire-smartphone-industry-q1-samsung-2015-7. 20. “Louis Vuitton Biography.” Biography. http://www.biography.com/people/louis-vuitton-17112264. 21. Apple Newsroom. “‘Designed by Apple in California’ chronicles 20 years of Apple design.” https://www.apple.com/newsroom/2016/11/designed-by-apple-in-california-chronicles-20-years-of-apple-design/. 22. Ibid. 23. Norman, Don. Emotional Design: Why We Love (or Hate) Everyday Things (New York: Basic Books, 2005). 24. Turner, Daniel. “The Secret of Apple Design.” MIT Technology Review, May 1, 2007. https://www.technologyreview.com/s/407782/the-secret-of-apple-design/. 25.
The New Geography of Jobs by Enrico Moretti
assortative mating, Bill Gates: Altair 8800, business climate, call centre, cleantech, cloud computing, corporate raider, creative destruction, desegregation, Edward Glaeser, financial innovation, global village, hiring and firing, income inequality, industrial cluster, Jane Jacobs, Jeff Bezos, Joseph Schumpeter, knowledge economy, labor-force participation, low skilled workers, manufacturing employment, Mark Zuckerberg, mass immigration, medical residency, Menlo Park, new economy, peer-to-peer lending, Peter Thiel, Productivity paradox, Richard Florida, Sand Hill Road, Silicon Valley, Skype, special economic zone, Startup school, Steve Jobs, Steve Wozniak, thinkpad, Tyler Cowen: Great Stagnation, Wall-E, Y Combinator, zero-sum game
On top of that, the assembly and manufacture of many of the parts have moved abroad, just as we saw in the case of the iPhone. The first batch of two hundred Apple I computers was assembled by Steve Jobs and Steve Wozniak in Jobs’s famous garage in Los Altos in 1976. Production didn’t stray far for a few years. During the 1980s, Apple was manufacturing most of its Macs in a factory in Fremont, California. But in 1992 Apple shut down the factory and shifted production first to cheaper parts of California and Colorado, then to Ireland and Singapore. All other American companies followed the model. As James Fallows once put it, “Everyone in America has heard of Dell, Sony, Compaq, HP, Lenovo-IBM ThinkPad, Apple, NEC, Gateway, Toshiba. Almost no one has heard of Quanta, Compal, Inventec, Wistron, Asustek. Yet nearly 90 percent of laptops and notebooks sold under the famous brand names are actually made by one of these five companies in their factories in mainland China.”
Initially thousands of small producers were scattered across the country. A few decades later the number dropped to three giant corporations, with most of the production near Detroit. Today car factories are again spread all over the world, from Brazil to Poland. When personal computers first appeared in the 1970s, a myriad of small independent producers were scattered all over America. Steve Jobs and Steve Wozniak made the first Apple computer in 1976 by buying components from a mail-order catalogue and assembling them in Jobs’s garage. Later the production of personal computers became a highly concentrated industry, with just a few key players, mostly in Silicon Valley. Right now the industry is maturing, and production is scattered among hundreds of low-cost locations. The same pattern has been documented in industries as diverse as iron founding, flour milling, and cigarette production.
Twenty-five million of these containers leave the port each year, almost one per second. In less than two weeks that merchandise will be on a truck headed for a Walmart distribution center, an IKEA warehouse, or an Apple store. Shenzhen is where the iPhone is assembled. If there is a poster child of globalization, it is the iPhone. Apple has given as much attention to designing and optimizing its supply chain as to the design of the phone itself. The process by which the iPhone is produced illustrates how the new global economy is reshaping the location of jobs and presenting new challenges for American workers. Apple engineers in Cupertino, California, conceived and designed the iPhone. This is the only phase of the production process that takes place entirely in the United States. It involves product design, software development, product management, marketing, and other high-value functions.
Lab Rats: How Silicon Valley Made Work Miserable for the Rest of Us by Dan Lyons
Airbnb, Amazon Web Services, Apple II, augmented reality, autonomous vehicles, basic income, bitcoin, blockchain, business process, call centre, Clayton Christensen, clean water, collective bargaining, corporate governance, corporate social responsibility, creative destruction, cryptocurrency, David Heinemeier Hansson, Donald Trump, Elon Musk, Ethereum, ethereum blockchain, full employment, future of work, gig economy, Gordon Gekko, greed is good, hiring and firing, housing crisis, income inequality, informal economy, Jeff Bezos, job automation, job satisfaction, job-hopping, John Gruber, Joseph Schumpeter, Kevin Kelly, knowledge worker, Lean Startup, loose coupling, Lyft, Marc Andreessen, Mark Zuckerberg, McMansion, Menlo Park, Milgram experiment, minimum viable product, Mitch Kapor, move fast and break things, move fast and break things, new economy, Panopticon Jeremy Bentham, Paul Graham, paypal mafia, Peter Thiel, plutocrats, Plutocrats, precariat, RAND corporation, remote working, RFID, ride hailing / ride sharing, Ronald Reagan, Rubik’s Cube, Ruby on Rails, Sam Altman, Sand Hill Road, self-driving car, shareholder value, Silicon Valley, Silicon Valley startup, six sigma, Skype, Social Responsibility of Business Is to Increase Its Profits, software is eating the world, Stanford prison experiment, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, TaskRabbit, telemarketer, Tesla Model S, Thomas Davenport, Tony Hsieh, Toyota Production System, traveling salesman, Travis Kalanick, tulip mania, Uber and Lyft, Uber for X, uber lyft, universal basic income, web application, Whole Earth Catalog, Y Combinator, young professional
By the 1970s, HP was a thriving organization that many in Silicon Valley (and beyond) wanted to emulate. Apple co-founder Steve Wozniak, who worked as an engineer at HP in the 1970s, later recalled: “We had such great camaraderie. We were so happy. Almost everyone spoke about it as the greatest company you could ever work for.” The 1970s brought another element to Silicon Valley—the idealistic values of the counterculture. “Power to the people” was the slogan of the 1960s, and it was also the motto of the people who led the personal computer revolution in the 1970s. Instead of sharing a mainframe, which was controlled by Big Brother, everyone could have their own computer. This was an incredibly radical idea, with huge implications for society. Wozniak and his Apple co-founder Steve Jobs were long-haired hippie-hackers who built their first personal computers as members of the Homebrew Computer Club, a pack of amateur kit-computer hobbyists.
It doesn’t seem to make sense that the same idealistic, altruistic wizards who design beautiful products and deliver exquisite user experiences—who create so much delightion, as my colleagues at HubSpot would say—should cause so much misery. Yet that’s what is happening. Apple makes terrific smartphones and provides world-class customer support, but the company also has dodged taxes using a scheme that Nobel laureate economist Joseph Stiglitz once called a “fraud.” Amazon Prime is an amazing service, but Amazon abuses workers in its headquarters and warehouses. Customers love Uber, but Uber operates a toxic workplace and exploits its drivers. Tesla makes very sexy electric cars, but by many accounts, Elon Musk behaves abominably toward his employees and has earned a reputation for being less than forthcoming with customers. “I don’t believe anything Elon Musk or Tesla says,” Apple co-founder Steve Wozniak, a disappointed Tesla owner, said in 2018. In the past few years I’ve come to the uncomfortable conclusion that, for various reasons mostly related to greed, the very people in Silicon Valley who talk so much about making the world a better place are actually making it worse—at least when it comes to the well-being of workers.
BBC.com, June 21, 2018. https://www.bbc.com/news/technology-44561838. Chapter 8: Change: “What Happens If You Live Inside a Hurricane That Never Ends?” Aouf, Rima Sabina. “Apple Park Employees Revolt over Having to Work in Open-Plan Offices.” Dezeen, August 10, 2017. https://www.dezeen.com/2017/08/10/apple-park-campus-employees-rebel-over-open-plan-offices-architecture-news. Bray, Chad. “No Laptop, No Phone, No Desk: UBS Reinvents the Work Space.” New York Times, November 3, 2016. https://www.nytimes.com/2016/11/04/business/dealbook/ubs-bank-virtual-desktops-london.html. Broussard, Mitchel. “Some Apple Park Employees Said to Be Dissatisfied with Open Office Design.” MacRumors, August 9, 2017. https://www.macrumors.com/2017/08/09/apple-park-employees-open-office. Bruch, Heike, and Jochen I. Menges. “The Acceleration Trap.” Harvard Business Review, April 2010. https://hbr.org/2010/04/the-acceleration-trap.
Where Good Ideas Come from: The Natural History of Innovation by Steven Johnson
Ada Lovelace, Albert Einstein, Alfred Russel Wallace, carbon-based life, Cass Sunstein, cleantech, complexity theory, conceptual framework, cosmic microwave background, creative destruction, crowdsourcing, data acquisition, digital Maoism, digital map, discovery of DNA, Dmitri Mendeleev, double entry bookkeeping, double helix, Douglas Engelbart, Douglas Engelbart, Drosophila, Edmond Halley, Edward Lloyd's coffeehouse, Ernest Rutherford, Geoffrey West, Santa Fe Institute, greed is good, Hans Lippershey, Henri Poincaré, hive mind, Howard Rheingold, hypertext link, invention of air conditioning, invention of movable type, invention of the printing press, invention of the telephone, Isaac Newton, Islamic Golden Age, James Hargreaves, James Watt: steam engine, Jane Jacobs, Jaron Lanier, Johannes Kepler, John Snow's cholera map, Joseph Schumpeter, Joseph-Marie Jacquard, Kevin Kelly, lone genius, Louis Daguerre, Louis Pasteur, Mason jar, mass immigration, Mercator projection, On the Revolutions of the Heavenly Spheres, online collectivism, packet switching, PageRank, patent troll, pattern recognition, price mechanism, profit motive, Ray Oldenburg, Richard Florida, Richard Thaler, Ronald Reagan, side project, Silicon Valley, silicon-based life, six sigma, Solar eclipse in 1919, spinning jenny, Steve Jobs, Steve Wozniak, Stewart Brand, The Death and Life of Great American Cities, The Great Good Place, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, transaction costs, urban planning
Even as much of the high-tech culture has embraced decentralized, liquid networks in their approach to innovation, the company that is consistently ranked as the most innovative in the world—Apple—remains defiantly top-down and almost comically secretive in its development of new products. You won’t ever see Steve Jobs or Jonathan Ive crowdsourcing development of the next-generation iPhone. If open and dense networks lead to more innovation, how can we explain Apple, which on the spectrum of openness is far closer to Willy Wonka’s factory than it is to Wikipedia? The easy answer is that Jobs and Ive simply possess a collaborative genius that has enabled the company to ship such a reliable stream of revolutionary products. No doubt both men are immensely talented at what they do, but neither of them can design, build, program, and market a product as complex as the iPhone on their own, the way Jobs and Steve Wozniak crafted the original Apple personal computer in the now-legendary garage.
ENDORPHINS (1975) Discovered at about the same time by two research teams working independently, endorphins were first described when American scientist John Hughes and German-born British biologist Hans Kosterlitz published the results of a study in which they removed an amino-acid molecule from the brain of a pig, which they believed would bolster investigations of the brain’s receptors for morphine. PERSONAL COMPUTER (1976) Legendarily working out of a garage, entrepreneurs and college dropouts Steve Wozniak and Steve Jobs designed one of the first personal computers, or microcomputers—the Apple I—in 1976, creating the first single-circuit-board computer, though many important models, including the Altair, preceded it. ONCOGENES (1976) Bolstering the understanding of cancer and how malignant tumors are created, American immunobiologist J. Michael Bishop and cellular biologist Harold Varmus discovered the first human oncogene in 1976. RNA SPLICING (1977) British biochemist Richard J.
Apple clearly has unparalleled leadership, but there must also be something in the environment at Apple that is allowing such revolutionary ideas to make it to the marketplace. As it turns out, while Apple has largely adopted a fortress mentality toward the outside world, the company’s internal development process is explicitly structured to facilitate clash and connection between different perspectives. Jobs himself has taken to describing their method via the allegory of the concept car. You go to an auto show and see some glamorous and wildly innovative concept car on display and you think, “I’d buy that in a second.” And then five years later, the car finally comes to market and it’s been whittled down from a Ferrari to a Pinto—all the truly breakthrough features have been toned down or eliminated altogether, and what’s left looks mostly like last year’s model.
100 Plus: How the Coming Age of Longevity Will Change Everything, From Careers and Relationships to Family And by Sonia Arrison
23andMe, 8-hour work day, Albert Einstein, Anne Wojcicki, artificial general intelligence, attribution theory, Bill Joy: nanobots, bioinformatics, Clayton Christensen, dark matter, disruptive innovation, East Village, en.wikipedia.org, epigenetics, Frank Gehry, Googley, income per capita, indoor plumbing, Jeff Bezos, Johann Wolfgang von Goethe, Kickstarter, Law of Accelerating Returns, life extension, personalized medicine, Peter Thiel, placebo effect, post scarcity, Ray Kurzweil, rolodex, Silicon Valley, Simon Kuznets, Singularitarianism, smart grid, speech recognition, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Levy, Thomas Malthus, upwardly mobile, World Values Survey, X Prize
As long as starting a business remains relatively easy, small businesses, which can turn into big businesses, are one avenue for young people looking to build experience. And in a world where technology will continue to help level playing fields, businesses that shun smart, young people will face competitive disadvantages. There are many examples of young people starting small companies that grew into larger, very successful companies. For example, Steve Jobs and Steve Wozniak started Apple in their early twenties—the same age as Bill Gates and Paul Allen when they started Microsoft—and Sergey Brin and Larry Page launched Google in their mid-twenties. GOING LONG ON AMBITION The degree to which longevity will change the economic order also depends somewhat on how much of an effect an increased health span has on the conditions of human ambition. A strong source of motivation, ambition affects individuals and societies on a large scale.
”62 The cool kids in the biohacking world, many of whom have competed at the International Genetically Engineered Machine (iGEM) competition, are not only watching Dr. Venter with great interest; they are also working to create their own projects. Making cells blink, glow, or smell like banana is what many DIY bio types have on their minds, and such frivolous pursuits are reminiscent of the beginnings of the personal computer revolution. Back in the 1970s, it was the Homebrew Computer Club that brought together clever thinkers—such as future Apple founders Steve Jobs and Steve Wozniak—to trade parts, circuits, and information for DIY computing devices. The point is that biology has become the latest and greatest engineering project, one that hobbyists celebrate. More importantly, eventually this passion will change the world. Those who have already made it big in the technology industry have not failed to notice. Aside from Bill Gates and Jeff Bezos, other tech titans who are driving interest in the longevity meme include Oracle’s Larry Ellison, PayPal cofounder Peter Thiel, Google’s Larry Page and Sergey Brin, and Microsoft cofounder Paul Allen.
See Artificial intelligence; AIDS; Airlines; Alchemists; Alcohol consumption; Alexander the Great; Algae; Alginate hydrogel; Allen, Paul; Allen, Woody; Allen Institute for Brain Science (Seattle); Alm, Richard; Alzheimer’s; Ambition; American Council on Science and Health; American Federation of Labor-Congress of Industrial Organizations; Ames, Bruce; Anatomy of Love (Fisher); Angola; Annas, George; Antibiotics; Apple Inc.; Aquinas, St. Thomas; Archimedes; Archon Genomics X PRIZE; Aristotle; Armed Forces Institute of Regenerative Medicine (AFIRM); Arnett, Dr. Jeffrey Jensen; Art; Artemisinin; Arteriocyte company; Artificial intelligence (AI); Artificial life; Asian American females; Asimov, Isaac; Association of Medical Practitioners and Dentists (Italy); Astrology; Atala, Dr. Anthony; AT&T; Atheists/agnostics (fig.)
Game Over Press Start to Continue by David Sheff, Andy Eddy
affirmative action, air freight, Alexey Pajitnov wrote Tetris, Apple II, Apple's 1984 Super Bowl advert, Buckminster Fuller, game design, HyperCard, inventory management, James Watt: steam engine, Jaron Lanier, Marshall McLuhan, Mikhail Gorbachev, pattern recognition, profit motive, revision control, Ronald Reagan, Silicon Valley, Steve Jobs, Steve Wozniak
The company’s legal department, however, was not among them, and the game never made it out the door. Steve Wozniak came over to Atari to help Jobs build another “Pong”-based game for Bushnell called “Breakout.” A paddle hit a ball against a wall of bricks that disappeared, one by one, when hit, until there were none left. Bushnell liked the game, but the circuitry required too many expensive computer chips. He offered Jobs a bonus of $100 for every chip he was able to eliminate. Jobs made himself $5,000. When they weren’t working their day jobs, Jobs and Wozniak were busy on their own, in the Jobs family garage. They built a makeshift computer—a circuit board, really—which they called the Apple I. Some of the parts had been lifted from Atari. The Apple I didn’t do much, but when Wozniak showed it off at a computer club meeting and the result was orders for fifty of the contraptions, it dawned on Jobs that there might actually be a market for personal computers, and he left Atari to found Apple.
“I made it with my own two hands and a soldering iron,” Bushnell says. He named it “Pong,” after the sonar-like “pongs” that sounded each time the ball made contact with the paddle. In the fall of 1972, Bushnell placed “Pong,” the first commercial video-arcade game, with a coin box bolted to the outside, in Andy Capp’s tavern, a popular Sunnyvale pool bar that holds a place in Silicon Valley lore rivaled only by the garage in which Steve Jobs and Steve Wozniak invented the Apple computer. Set beside a pinball machine, “Pong” was an oddity, a dark wood cabinet that held a black-and-white TV screen on which cavorted a white blip like a shooting star in a black sky. One of the bar’s patrons stood over the machine, examining it. “Avoid missing ball for high score,” read the only line of instructions. The young man reached into his pocket, extracted a quarter, and slipped it into a slot on the console as he called a friend over.
The company literally couldn’t afford the payroll twice one month. Don Valentine’s money had helped build up production, but the returns lagged. A big success that followed “Pong” bailed them out. It was the first video car-racing game that was controlled by a steering wheel attached to the cabinet. The game, “Gran Trak,” gobbled up quarters even faster than “Pong.” A friend of Steve Jobs, Steve Wozniak, an engineer at Hewlett Packard, was a “Gran Trak” addict. Most evenings after work he headed to a pub, where he put great quantities of quarters, money he could not afford, into “Gran Trak.” Jobs began to sneak him into Atari’s production facility at night, where he could play the game for free. In exchange for the free-game time, Woz, a whiz with computers, helped out whenever Jobs hit a stumbling block with some particularly tricky circuitry.
Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen
Apple II, Brian Krebs, Burning Man, corporate governance, dumpster diving, Exxon Valdez, Hacker Ethic, hive mind, index card, Kickstarter, McMansion, Mercator projection, offshore financial centre, packet switching, pirate software, Ponzi scheme, Robert Hanssen: Double agent, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, traffic fines, web application, WikiLeaks, zero day, Zipcar
Pranks were a part of the hacker culture, and so was phone phreaking—the usually illegal exploration of the forbidden back roads of the telephone network. But hacking was above all a creative effort, one that would lead to countless watershed moments in computer history. The word “hacker” took on darker connotations in the early 1980s, when the first home computers—the Commodore 64s, the TRS-80s, the Apples—came to teenagers’ bedrooms in suburbs and cities around the United States. The machines themselves were a product of hacker culture; the Apple II, and with it the entire home computer concept, was born of two Berkeley phone phreaks named Steve Wozniak and Steve Jobs. But not all teenagers were content with the machines, and in the impatience of youth, they weren’t inclined to wait for grad school to dip into real processing power or to explore the global networks that could be reached with a phone call and the squeal of a modem.
See http://www.securityfocus.com/comments/articles/203/5729/threaded (May 24, 2001). Max says he did not consider himself an informant and only provided technical information. Chapter 4: The White Hat 1 The first people to identify themselves as hackers: The seminal work on the early hackers is Steven Levy, Hackers: Heroes of the Computer Revolution (New York: Anchor Press/Doubleday, 1984). Also see Steve Wozniak and Gina Smith, iWoz: From Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It (New York: W. W. Norton and Company, 2006). 2 Tim was at work one day: This anecdote was recalled by Tim Spencer. Max later recalled Spencer’s advice in a letter to his sentencing judge in Pittsburgh. 3 If there was one thing Max: Details of Max’s relationship with Kimi come primarily from interviews with Kimi. 4 Max went up to the city to visit Matt Harrigan: Harrigan’s business and his work with Max were described primarily by Harrigan, with some details confirmed by Max.
He met twenty-year-old Kimi Winters at a rave called Warmth, held on an empty warehouse floor in the city—Max had become a fixture in the rave scene, dancing with a surprising, fluid grace, whirling his arms like a Brazilian flame dancer. Kimi was a community college student and part-time barista. A foot shorter than Max, she sported an androgynous appearance in the shapeless black hoodie she liked to wear when she went out. But on a second look, she was decidedly cute, with apple cheeks and her Korean mother’s copper-tinted skin. Max invited Kimi to a party at his place. The parties at Hungry Manor were legendary, and when Kimi arrived the living room was already packed with dozens of party guests from Silicon Valley’s keyboard class—programmers, system administrators, and Web designers—mingling under the glass chandelier. Max lit up when he spotted her. He led her on a tour of the house, pointing out the geeky accoutrements the Hungry Programmers had added.
From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism by Fred Turner
1960s counterculture, A Declaration of the Independence of Cyberspace, Apple's 1984 Super Bowl advert, back-to-the-land, bioinformatics, Buckminster Fuller, business cycle, Claude Shannon: information theory, complexity theory, computer age, conceptual framework, Danny Hillis, dematerialisation, distributed generation, Douglas Engelbart, Douglas Engelbart, Dynabook, Electric Kool-Aid Acid Test, From Mathematics to the Technologies of Life and Death, future of work, game design, George Gilder, global village, Golden Gate Park, Hacker Ethic, Haight Ashbury, hive mind, Howard Rheingold, informal economy, invisible hand, Jaron Lanier, John Markoff, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, market bubble, Marshall McLuhan, mass immigration, means of production, Menlo Park, Mitch Kapor, Mother of all demos, new economy, Norbert Wiener, peer-to-peer, post-industrial society, postindustrial economy, Productivity paradox, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, Richard Stallman, Robert Shiller, Robert Shiller, Ronald Reagan, Shoshana Zuboff, Silicon Valley, Silicon Valley ideology, South of Market, San Francisco, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Ted Nelson, Telecommunications Act of 1996, The Hackers Conference, theory of mind, urban renewal, Vannevar Bush, Whole Earth Catalog, Whole Earth Review, Yom Kippur War
I wasn’t approaching it from either a theoretical point of view or an engineering point of view, but from sort of a fun-ness point of view.”59 According to Levy, this point of view characterized the work of two subsequent generations of innovators. The first comprised the “hardware hackers” of the 1970s. Clustered in and around the San Francisco Bay area, they included the young founders of Apple Computer, Steve Jobs and Steve Wozniak, as well as early proselytizers for personal computing such as Lee Felsenstein, Bob Albrecht, and Ted Nelson, a programmer who had authored a volume loosely based on the Whole Earth Catalog entitled Computer Lib: You Can and Must Understand Computers Now. For this generation, Levy suggested, computing was a form of political rebellion. Computers may have always been large and centralized, they may have always been guarded by institutionalized experts, and they may have been used to organize the war in Vietnam, but this generation would put them to new uses.
In particular, he suggested, they wanted to “witness or have the group articulate what the hacker ethic was.”63 Brand and Kelly aimed to explore via the conference whether hackers might constitute the sort of cultural vanguard for the 1980s that the back-to-the-land and ecology crowds had hoped to be for the decade before. Something like 150 hackers actually arrived. Among others, they included luminaries such as Steve Wozniak of Apple, Ted Nelson, free software pioneer Richard Stallman, and John Draper—known as Captain Crunch for his discovery that a toy whistle he found in a box of the cereal gave just the right tone to grant him free access to the phone system. Some of the hackers worked alone, part-time, at home; others represented such diverse institutions as MIT, Stanford, Lotus Development, and various software makers.
The foundation would, in addition, work “to convey to both the public and the policy-makers metaphors which will illuminate the more general stake in liberating Cyberspace.”84 The first and most influential of the metaphors Barlow referred to was the “electronic frontier.”85 Being master networkers, Kapor and Barlow quickly gained press coverage of their new organization as well as offers of funding from Steve Wozniak, cofounder of Apple, and John Gilmore of Sun Microsystems. They started a conference on the WELL, and they recruited Stewart Brand, among others, to serve on their new organization’s board of directors. One evening in the early fall, Barlow convened a dinner in San Francisco attended by Brand, Jaron Lanier, Chuck Blanchard (who worked at VPL with Lanier), and Paul Saffo (head of the Institute for the Future, a Silicon Valley think tank).
Coders: The Making of a New Tribe and the Remaking of the World by Clive Thompson
2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 4chan, 8-hour work day, Ada Lovelace, AI winter, Airbnb, Amazon Web Services, Asperger Syndrome, augmented reality, Ayatollah Khomeini, barriers to entry, basic income, Bernie Sanders, bitcoin, blockchain, blue-collar work, Brewster Kahle, Brian Krebs, Broken windows theory, call centre, cellular automata, Chelsea Manning, clean water, cloud computing, cognitive dissonance, computer vision, Conway's Game of Life, crowdsourcing, cryptocurrency, Danny Hillis, David Heinemeier Hansson, don't be evil, don't repeat yourself, Donald Trump, dumpster diving, Edward Snowden, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, Ethereum, ethereum blockchain, Firefox, Frederick Winslow Taylor, game design, glass ceiling, Golden Gate Park, Google Hangouts, Google X / Alphabet X, Grace Hopper, Guido van Rossum, Hacker Ethic, HyperCard, illegal immigration, ImageNet competition, Internet Archive, Internet of things, Jane Jacobs, John Markoff, Jony Ive, Julian Assange, Kickstarter, Larry Wall, lone genius, Lyft, Marc Andreessen, Mark Shuttleworth, Mark Zuckerberg, Menlo Park, microservices, Minecraft, move fast and break things, move fast and break things, Nate Silver, Network effects, neurotypical, Nicholas Carr, Oculus Rift, PageRank, pattern recognition, Paul Graham, paypal mafia, Peter Thiel, pink-collar, planetary scale, profit motive, ransomware, recommendation engine, Richard Stallman, ride hailing / ride sharing, Rubik’s Cube, Ruby on Rails, Sam Altman, Satoshi Nakamoto, Saturday Night Live, self-driving car, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, single-payer health, Skype, smart contracts, Snapchat, social software, software is eating the world, sorting algorithm, South of Market, San Francisco, speech recognition, Steve Wozniak, Steven Levy, TaskRabbit, the High Line, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, universal basic 
income, urban planning, Wall-E, Watson beat the top human players on Jeopardy!, WikiLeaks, women in the workforce, Y Combinator, Zimmermann PGP, éminence grise
a “hacker ethic”: Levy, Hackers, 26–37. “hacking in general”: Levy, Hackers, 107. tinker with the code: “GNU General Public License,” Free Software Foundation, June 29, 2007, accessed August 16, 2018, https://www.gnu.org/licenses/gpl-3.0.en.html. “bunch of other robots”: Levy, Hackers, 129. like the MIT hackers: Clive Thompson, “Steve Wozniak’s Apple I Booted Up a Tech Revolution,” Smithsonian, March 2016, accessed August 18, 2018, https://www.smithsonianmag.com/smithsonian-institution/steve-wozniaks-apple-i-booted-up-tech-revolution-180958112/. machine for $300: Philip H. Dougherty, “Commodore Computers Plans Big Campaign,” New York Times, February 18, 1982, accessed August 18, 2018, https://www.nytimes.com/1982/02/18/business/advertising-commodore-computers-plans-big-campaign.html. grasp and wield: Harry McCracken, “Fifty Years of BASIC, the Programming Language That Made Computers Personal,” Time, April 29, 2014, accessed August 18, 2018, http://time.com/69316/basic/; “BASIC Begins at Dartmouth,” Dartmouth, accessed August 18, 2018, https://www.dartmouth.edu/basicfifty/basic.html; Jimmy Maher, “In Defense of BASIC,” The Digital Antiquarian (blog), May 2, 2011, accessed August 18, 2018, https://www.filfre.net/2011/05/in-defense-of-basic.
“I spent my lifetime walking around talking like a robot, talking to a bunch of other robots,” as one of them later said with a sigh. By the ’80s, the nature of computers changed again. The devices were becoming cheaper and cheaper, as a new breed of manufacturer decided it was time to truly bring computers to the masses. Over in Silicon Valley in 1976, Steve Wozniak created the Apple I, one of the first computers that had a radical design element: It could plug into a regular TV. Turn it on, and you could immediately start coding, just like the MIT hackers. Soon plenty of other manufacturers began following Apple’s lead, driving the price of computers down to something a middle-class family could afford. In 1981, Commodore released the VIC-20, a plug-and-play machine for $300. The revolution begun by Wilkes had now spread to the wood-paneled basements of America. Suddenly, teenagers with enough money could essentially stumble into the world of programming.
See artificial intelligence (AI) Albright, Jonathan, ref1 Alciné, Jacky, ref1 algorithms, ref1, ref2 bias in ranking systems, ref1 scale and, ref1 algorithms challenge whiteboard interview, ref1, ref2, ref3 Algorithms of Oppression (Noble), ref1 Allen, Fran, ref1, ref2 Allen, Paul, ref1 AlphaGo, ref1, ref2 Altman, Sam, ref1, ref2 Amabile, Teresa M., ref1 Amazon, ref1, ref2, ref3 Amazons (board game), ref1 Amazon Web Services, ref1 Analytical Engine, ref1 Anderson, Tom, ref1 AND gate, ref1 Andreessen, Marc, ref1, ref2, ref3, ref4, ref5, ref6, ref7, ref8 Antisocial Media (Vaidhyanathan), ref1 Apple, ref1 Apple I, ref1 Apple iPhone, ref1, ref2 aptitude testing, ref1 architects, ref1 artificial intelligence (AI), ref1 dangers of, warnings about and debate over, ref1 de-biasing of, ref1 deep learning (See deep learning) edge cases and, ref1 expert systems, ref1 Hollywood depiction of, ref1 initial attempts to create, at Dartmouth in 1956, ref1 job listing sites, biased results in, ref1 justice system, effect of AI bias on, ref1 learning problem, ref1 neural nets (See neural nets) racism and sexism, learning of, ref1 artistic temperaments, ref1 Assembly computer language, ref1 Atwood, Jeff, ref1, ref2 Babbage, Charles, ref1, ref2 back-end code, ref1, ref2, ref3, ref4 backpropagation, ref1 “Bad Smells in Code” (Fowler and Beck), ref1 Baffler, The, ref1 Bahnken, A.
Jony Ive: The Genius Behind Apple's Greatest Products by Leander Kahney
Apple II, banking crisis, British Empire, Chuck Templeton: OpenTable:, Computer Numeric Control, Dynabook, global supply chain, interchangeable parts, John Markoff, Jony Ive, Kickstarter, race to the bottom, RFID, side project, Silicon Valley, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, the built environment, thinkpad, Tim Cook: Apple
It was more than four years since Brunner had written his conceptual brief. With the much-anticipated twentieth anniversary of the Macintosh approaching, the decision was made to designate Spartacus as a special edition. Officially named the Twentieth Anniversary Macintosh, the new product was limited to a run of just twenty thousand units. Apple unveiled it at Macworld in January 1997 and the first two units were given to Steve Jobs and Steve Wozniak, who had just returned to the company as advisers. To make it more memorable, the machine was hand-delivered to customers’ homes by specially trained “concierges,” who set up the machines, installed any expansion cards (along with the ugly hunchback) and showed users how to use them. “I think it is the first sensible computer design that we have seen in a long time,” said Henry Steiner, Hong Kong’s most eminent graphic designer.
Like the MessagePad, the Twentieth Anniversary Macintosh (TAM) won not only kudos but awards, including the Best of Category prize for I.D. magazine’s Annual Design Review. Steve Wozniak thought it was the perfect college machine “with the computer, TV, radio, CD player and more (AV even) all in one sleek machine.” He had several at his mansion in the hills of Los Gatos above Silicon Valley. By the time the machine was pulled from the market one year after launch, however, Wozniak seemed to be the only person on the planet who liked it. The TAM bombed in the marketplace. The machine widely missed its mark. Originally priced at $9,000, within a year the list price had dropped to under $2,000. It was originally intended as a mainstream product, but the marketing group turned it into a pricey special edition. It was the last straw. After all the battles to get the TAM to market, Brunner had grown tired of Apple’s dysfunctional culture.
Bye-bye, Brunner
Just before the release of the Twentieth Anniversary Mac, Brunner quit.
CHAPTER 12 Unibody Everywhere 1. Apple special event video, Oct 14: Apple Notebook Event 2008, New Way to Build 2-/6, 2008, video, http://www.youtube.com/watch?v=7JLjldgjuKI. 2. Ibid. 3. Interview with Doug Satzger, January 2013. 4. Apple special event video, Oct 14. 5. Interview with Chris Lefteri, October 2012. 6. Interview with a former Apple engineer, June 2013. 7. Personal interview, June 2013. 8. Interview with Dennis Boyle, October 2012. 9. Interview with a former Apple engineer, June 2013. 10. Horace Dediu, “How Much Do Apple’s Factories Cost?” http://www.asymco.com/2011/10/16/how-much-do-apples-factories-cost/ October 16, 2011. 11. Greenpeace, “Guide to Greener Electronics 18,” http://www.greenpeace.org/new-zealand/en/Guide-to-Greener-Electronics/18th-Edition/APPLE/, November 2012. 12.
Robot Rules: Regulating Artificial Intelligence by Jacob Turner
Ada Lovelace, Affordable Care Act / Obamacare, AI winter, algorithmic trading, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, autonomous vehicles, Basel III, bitcoin, blockchain, brain emulation, Clapham omnibus, cognitive dissonance, corporate governance, corporate social responsibility, correlation does not imply causation, crowdsourcing, distributed ledger, don't be evil, Donald Trump, easy for humans, difficult for computers, effective altruism, Elon Musk, financial exclusion, financial innovation, friendly fire, future of work, hive mind, Internet of things, iterative process, job automation, John Markoff, John von Neumann, Loebner Prize, medical malpractice, Nate Silver, natural language processing, nudge unit, obamacare, off grid, pattern recognition, Peace of Westphalia, race to the bottom, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stanislav Petrov, Stephen Hawking, Steve Wozniak, strong AI, technological singularity, Tesla Model S, The Coming Technological Singularity, The Future of Employment, The Signal and the Noise by Nate Silver, Turing test, Vernor Vinge
Our approach remembers old tasks by selectively slowing down learning on the weights important for those tasks.36 Other research projects have focused on AI’s ability to plan and “imagine” possible consequences of its actions under conditions of uncertainty—another step in the progression from narrower to more general AI.37 Leading technology companies are now focusing dedicated projects on multipurpose AI.38 Indeed, to accomplish many everyday tasks requires not just one discrete acumen, but rather multiple skills. Apple co-founder Steve Wozniak alluded to this point when he suggested in 2007 that we would never develop a robot with the numerous different capabilities needed to enter an unfamiliar home and make a coffee.39 Kazuo Yano, the Corporate Chief Engineer of the Research and Development Group at technology conglomerate Hitachi, has said: “Many new technologies are developed for many specific purposes… For example, mobile phones originated from phones specialized for car use.”
So for a computer to do it the same way, it has to go through the same learning, walking to a house using some kind of optical with a vision system, stepping around and opening the door properly, going down the wrong way, going back, finding the kitchen, detecting what might be a coffee machine. You can’t program these things, you have to learn it, and you have to watch how other people make coffee. … This is a kind of logic that the human brain does just to make a cup of coffee. We will never ever have artificial intelligence. Your pet, for example, your pet is smarter than any computer”. Steve Wozniak, interviewed by Peter Moon, “Three Minutes with Steve Wozniak”, PC World, 19 July 2007. See also Luke Muehlhauser, “What Is AGI?”, MIRI, https://intelligence.org/2013/08/11/what-is-agi/, accessed 1 June 2018. 40Interview with Dr. Kazuo Yano, “Enterprises of the Future Will Need Multi-purpose AIs”, Hitachi Website, http://www.hitachi.co.jp/products/it/it-pf/mag/special/2016_02th_e/interview_ky_02.pdf, accessed 1 June 2018. 41UK Department of Transport, “The Pathway to Driverless Cars: Detailed Review of Regulations for Automated Vehicle Technologies”, UK Government Website, February 2015, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/401565/pathway-driverless-cars-main.pdf, accessed 1 June 2018. 42When in 2017 the UK’s House of Lords Science and Technology Select Committee published a report entitled “Connected and Autonomous Vehicles: The Future?”
See Mark Anderson and Victor Warner, Drafting and Negotiating Commercial Contracts (Haywards Heath: Bloomsbury Professional, 2016), 18. 87In some systems, the requirement for something of value to pass is known as “consideration”. 88However, it can also be the case that a contract, and indeed contractual terms, will be deemed to have been agreed by the parties as a result of their relationship. When a person buys a crate of apples, there is usually an implied term that those apples will not be full of maggots. 89Kirsten Korosec, “Volvo CEO: We Will Accept All Liability When Our Cars Are in Autonomous Mode”, Fortune, 7 October 2015, http://fortune.com/2015/10/07/volvo-liability-self-driving-cars/, accessed 1 June 2018. 90 EWCA Civ 1. 91Fumio Shimpo, “The Principal Japanese AI and Robot Strategy and Research toward Establishing Basic Principles”, Journal of Law and Information Systems, Vol. 3 (May 2018). 92Dirk A.
Alpha Girls: The Women Upstarts Who Took on Silicon Valley's Male Culture and Made the Deals of a Lifetime by Julian Guthrie
Airbnb, Apple II, barriers to entry, blockchain, Bob Noyce, call centre, cloud computing, credit crunch, disruptive innovation, Elon Musk, equal pay for equal work, fear of failure, game design, glass ceiling, hiring and firing, Jeff Bezos, Louis Pasteur, Lyft, Mark Zuckerberg, Menlo Park, Mitch Kapor, new economy, PageRank, peer-to-peer, pets.com, phenotype, place-making, Ronald Reagan, Rosa Parks, Sand Hill Road, Silicon Valley, Silicon Valley startup, Skype, Snapchat, software as a service, South of Market, San Francisco, stealth mode startup, Steve Jobs, Steve Wozniak, TaskRabbit, Tim Cook: Apple, Travis Kalanick, uber lyft, unpaid internship, upwardly mobile, urban decay, web application, William Shockley: the traitorous eight, women in the workforce
Even before she finished her master’s degree, Magdalena had gone to seven job interviews and received seven offers. Her first interview had been with Steve Jobs and Steve Wozniak of Apple. The founders invited double-e students to the LOTS computer center to hear a pitch about their three-year-old company. If the students liked what they heard, they could stay and be interviewed. Jobs, wearing wire-rimmed glasses and jeans, told Magdalena and the other double-e students that working for Apple would be like “an extension of college.” Magdalena was one of sixteen students who showed up for the interview. She loved the idea of working for Apple. Jobs had dropped out of college and spent time studying Hinduism and Buddhism in India. He was a technologist but also a student of the mind. Magdalena had a window opened to her own mind through her love of technology.
It was ruled by men: Samuel Brannan, Levi Strauss, John Studebaker, Henry Wells, and William Fargo. Women, outnumbered and overmatched, were mostly reduced to entertainers, companions, wives, or housekeepers. Things were not that different in the more recent gold rush. The Valley was always a region dominated by men, from William Hewlett, Dave Packard, Bob Noyce, Gordon Moore, Andy Grove, Larry Ellison, Steve Jobs, and Steve Wozniak to, decades later, in the twenty-first century, Larry Page, Sergey Brin, Mark Zuckerberg, Elon Musk, Tim Cook, Travis Kalanick, and Marc Benioff. Mary Jane, fueled by peanut butter sandwiches packed in wax paper for the two-day journey, was under no illusion that it would be easy to navigate the old boys’ club of Sand Hill Road and Silicon Valley. Even today, decades after Mary Jane first arrived, 94 percent of investing partners at venture capital firms—the financial decision makers shaping the future—are men, and more than 80 percent of venture firms have never had a woman investing partner.
Magdalena knew that the spirit of entrepreneurship had always been a part of Marc’s life. When he was a teenager, he and a few friends started a company called Liberty Software to make adventure games for the Atari 800. He earned extra money going to people’s homes to repair antennas and CB radios, and he worked for Apple Computer the summer before his junior year at the University of Southern California. Largely unsupervised, Marc started writing software for a game about raiding IBM’s headquarters. His manager at Apple said the game wasn’t appropriate and later suggested Marc consider a job at Oracle. They had the best salespeople in the world, he was told. In his first year at Oracle, Benioff was named rookie of the year. After lunch at the country club, Magdalena jumped on a call with Benioff and Parker Harris, who was running a software programming and consulting company called Left Coast Software with two engineers.
Disrupted: My Misadventure in the Start-Up Bubble by Dan Lyons
activist fund / activist shareholder / activist investor, Airbnb, Ben Horowitz, Bernie Madoff, bitcoin, call centre, cleantech, cloud computing, corporate governance, disruptive innovation, dumpster diving, fear of failure, Filter Bubble, Golden Gate Park, Google Glasses, Googley, Gordon Gekko, hiring and firing, Jeff Bezos, Lean Startup, Lyft, Marc Andreessen, Mark Zuckerberg, Menlo Park, minimum viable product, new economy, Paul Graham, pre–internet, quantitative easing, ride hailing / ride sharing, Rosa Parks, Sand Hill Road, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, Snapchat, software as a service, South of Market, San Francisco, Stanford prison experiment, Steve Ballmer, Steve Jobs, Steve Wozniak, telemarketer, tulip mania, uber lyft, Y Combinator, éminence grise
Then he will pause, as if he has just said something incredibly profound and wants to give you a moment to let it sink in. Then he repeats the line, and a ballroom full of marketing people cheer. But when I meet them together it occurs to me that their different personalities are probably why their partnership works. There’s a yin-and-yang quality, like the one between Steve Jobs and Steve Wozniak, the co-founders of Apple. Halligan is the Jobs figure, the corporate visionary, the guy who thinks about sales and marketing. Shah is like Woz, the nerdy software programmer. Shah is wearing scruffy jeans and a rumpled T-shirt, his usual attire. He has dark hair and a dark beard, flecked with gray. Halligan wears jeans, and a sports jacket over a button-down oxford shirt. His hair is gray, as gray as my own, in fact, and he wears the same kind of chunky horn-rimmed glasses that I do.
Twelve The New Work: Employees as Widgets It turns out I’ve been naïve. I’ve spent twenty-five years writing about technology companies, and I thought I understood this industry. But at HubSpot I’m discovering that a lot of what I believed was wrong. I thought, for example, that tech companies began with great inventions—an amazing gadget, a brilliant piece of software. At Apple Steve Jobs and Steve Wozniak built a personal computer; at Microsoft Bill Gates and Paul Allen developed programming languages and then an operating system; Sergey Brin and Larry Page created the Google search engine. Engineering came first, and sales came later. That’s how I thought things worked. But HubSpot did the opposite. HubSpot’s first hires included a head of sales and a head of marketing. Halligan and Dharmesh filled these positions even though they had no product to sell and didn’t even know what product they were going to make.
In their mind, HubSpot belongs to them, not to these interlopers and outsiders who are now storming into the place and writing memos and telling everybody how they should be doing their jobs. Many of these people have never worked anywhere else. A lot of them aren’t very good. But here, they’re in charge. And I’m stuck working under them. Eight The Bozo Explosion Apple CEO Steve Jobs used to talk about a phenomenon called a “bozo explosion,” by which a company’s mediocre early hires rise up through the ranks and end up running departments. The bozos now must hire other people, and of course they prefer to hire bozos. As Guy Kawasaki, who worked with Jobs at Apple, puts it: “B players hire C players, so they can feel superior to them, and C players hire D players.” That’s the bozo explosion, and that’s what I believe has happened at HubSpot in the course of the last seven years. “How weird are you, on a scale from one to ten?”
Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do by Jeremy Bailenson
Apple II, augmented reality, computer vision, deliberate practice, experimental subject, game design, Google Glasses, income inequality, Intergovernmental Panel on Climate Change (IPCC), iterative process, Jaron Lanier, low earth orbit, Mark Zuckerberg, Marshall McLuhan, meta analysis, meta-analysis, Milgram experiment, nuclear winter, Oculus Rift, randomized controlled trial, Silicon Valley, Skype, Snapchat, Steve Jobs, Steve Wozniak, Steven Pinker, telepresence, too big to fail
I was reminded of this while giving a talk at a technology conference in 2016 alongside Steve Wozniak, cofounder of Apple. Woz is high on VR—his first HTC Vive experience gave him goose bumps. But he cautions about overspecifying use cases. He told the story of the early days at Apple, and how when he and Steve Jobs made the Apple II, they conceived of it as a home appliance for computer enthusiasts, and believed users would use it to play games, or to store and access recipes in the kitchen. But it turned out it was good for unexpected applications. Sales really took off when a spreadsheet program was developed and suddenly people could do office work from home. According to Wozniak, he and Jobs were wrong about exactly what the Apple II would be used for. They knew they’d created something revolutionary, but they were mistaken in what that revolution meant.
An industry is already growing in Hollywood and Silicon Valley to explore VR as a space for fictional narratives, and within it storytellers from Hollywood and the gaming world, with technical help from technology companies, are beginning to take the tentative early steps in defining the grammar of virtual storytelling. Brett Leonard was a young filmmaker fresh from his hometown of Toledo, Ohio when he landed in Santa Cruz just before the beginning of the first VR boom in 1979. There he fell in with future Silicon Valley icons like Steve Wozniak, Steve Jobs, and Jaron Lanier. Jaron is also one of our most incisive and visionary thinkers about technology and its effects on human commerce and culture; at the time, as well as now, he was the very public face of virtual reality, a term he coined and popularized. Leonard was a big fan of technology and science fiction when his trip to Silicon Valley dropped him into what must have seemed the most interesting place in the world, amidst a group of people who were already playing a significant role in shaping the future.
ABC News, 210–11 Abrash, Michael, 180 absence, vs. presence, 250 ADHD, 147 adult entertainment, 52, 247 advertising, 249 advocacy. See VR advocacy ageism, 84–85, 87–88, 93–95 Ahn, Sun Joo, 90–91, 114–16 AI (artificial intelligence), 225 air rage incidents, 211–12 Alito, Samuel, 65 Allen, Barbara, 203–4 Alzheimer’s, 146 Andreesen-Horowitz, 7 animal rights, 102–7 animal studies, 55–56 animated shorts, 226 anonymity, 201 Apollo missions, 108–10 Apple Computers Incorporated, 11, 195 Apple II, 11 archaeological sites, VR reconstruction of, 246 ARCHAVE system, 246 Arizona Cardinals, 16–17, 18, 33 astronomy, 108–10 AT&T, 179 athletics, 14–18, 27–28, 30–39, 100–102 attention, 36, 71, 219–20 audiovisual technology, 25–26 Augmented Reality, 71 avatar limb movements, 161, 162–66 avatars, 61–62, 86–87, 94, 99, 248–49 animal, 103–5 facial expressions and, 194–201 photorealistic, volumetric, 208 race and, 88–89 teacher, 242–44 three-dimensional, 168 vs. video, 195–200 avatar trolling, 202 aviation, 24 baby-boomers, aging of, 153 Bagai, Shaun, 176–77, 179–80, 188 Baidu, 235 Bailenson, Jeremy, 1–5, 7–9, 18, 35–36, 84, 99, 100–104, 112–13, 248–50 “Effects of Mass Media” course, 60 “The Effects of Mass Media” course at Stanford, 51 Infinite Reality, 111, 226 Virtual People class, 29–30, 58–59, 66, 242 Banaji, Mahzarin, 89 Bandura, Albert, 59–60 bandwidth, 197–98 Baron, Marty, 204 Beall, Andy, 29 “Becoming Homeless,” 98 behavior modeling, 59–65, 252–53 Belch, Derek, 29–30, 33–34 Bernstein, Lewis, 229–32 Blascovich, Jim, 226, 248 blindness, 92 Bloom, Benjamin S., 243 Bloomgren, Mike, 32 Bobo-doll study, 59–60 body adaptation, studies of, 165–68 body maps, 164–65 body movements, 181–84 measuring, 22 utilization of, 38 body transfer, 86, 92–95, 104–5 Bolas, Mark, 7, 29 Bostick, Joshua, 103–7 boundary effect, 94–95 Bower, Jim, 165 Brady, Mathew, 205–6 brain mapping, 167 brains, effect of VR on, 52–58 brain science, behavior modeling and, 60–61 BRAVEMIND, 148 Breivik, Anders Behring, 65 
Falconer, Caroline, 106–7 Bridgeway Island Elementary School, 132–33 Brown, Thackery, 54 Brown, Veronica, 134 Brown v.
The Simulation Hypothesis by Rizwan Virk
3D printing, Albert Einstein, Apple II, artificial general intelligence, augmented reality, Benoit Mandelbrot, bioinformatics, butterfly effect, discovery of DNA, Dmitri Mendeleev, Elon Musk, en.wikipedia.org, Ernest Rutherford, game design, Google Glasses, Isaac Newton, John von Neumann, Kickstarter, mandelbrot fractal, Marc Andreessen, Minecraft, natural language processing, Pierre-Simon Laplace, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, Silicon Valley, Stephen Hawking, Steve Jobs, Steve Wozniak, technological singularity, Turing test, Vernor Vinge, Zeno's paradox
Nevertheless, the video game pioneers of that time persisted, squeezing every bit of performance out of the limited hardware and memory of the day to create these early arcade games. A well-known anecdote from Silicon Valley at the time involves future Apple Computer co-founders Steve Jobs and Steve Wozniak. Jobs worked for Nolan Bushnell, the founder of Atari, and promised his boss that he could build a certain game quickly using limited memory resources. Bushnell was skeptical but gave him the project. Jobs brought in his friend Steve Wozniak, who created the game at night after his full-time engineering job. Wozniak, the future creator of the first Apple computers, is of course acknowledged today as a hardware genius. In some ways, the history of video games is the history of optimizing very limited resources. Without these optimization techniques, the entire field of computer graphics (and thus video games and digital media) would not be possible, nor would we have traveled very far down the road to the simulation point.
Beyond the bleachers, there was a sky with clouds and a city or country landscape that was only partly visible. I found myself wondering how far this “simulated world” extended in all directions beyond the track. What happened when no one was playing the video game? Did the characters and the buildings still exist, or did they simply cease to exist? Although I learned to program rudimentary video games myself shortly thereafter when my parents bought my brother and me a Commodore 64 (and later, an Apple II), it would be many years before I understood video game development well enough to answer these types of questions. The first game I ever created was Tic Tac Toe, basically putting blocky lines on the screen and then figuring out how to get the computer to “draw” Xs and Os on the squares selected by the players. My brother and I would play each other, but after he got bored I figured I could play against the computer.
At the same time, I watched as video game fidelity improved, moving from 8-bit to 16-bit games, and the world “in there” started to look more and more realistic. More than a decade after that, I moved to Silicon Valley at the start of the mobile gaming revolution. I designed a number of different games, including Tap Fish—one of the most popular games of its type (a resource management game, also called a simulation game), having reached more than 30 million downloads in the early days of the Apple iPhone. Later, I designed multiplayer competitive games based on popular TV shows like Penny Dreadful and Grimm and became an advisor to and investor in many video game companies. During these years, games evolved from simple adventure and arcade games to fully 3D massively multiplayer online role-playing games (MMORPGs), such as Ultima Online and World of Warcraft. Some games were actually virtual worlds, such as Second Life and The Sims, in which the goal was more about “simulating” life and less about fighting monsters.
The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity by Byron Reese
agricultural Revolution, AI winter, artificial general intelligence, basic income, Buckminster Fuller, business cycle, business process, Claude Shannon: information theory, clean water, cognitive bias, computer age, crowdsourcing, dark matter, Elon Musk, Eratosthenes, estate planning, financial independence, first square of the chessboard, first square of the chessboard / second half of the chessboard, full employment, Hans Rosling, income inequality, invention of agriculture, invention of movable type, invention of the printing press, invention of writing, Isaac Newton, Islamic Golden Age, James Hargreaves, job automation, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, lateral thinking, life extension, Louis Pasteur, low skilled workers, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Mary Lou Jepsen, Moravec's paradox, On the Revolutions of the Heavenly Spheres, pattern recognition, profit motive, Ray Kurzweil, recommendation engine, Rodney Brooks, Sam Altman, self-driving car, Silicon Valley, Skype, spinning jenny, Stephen Hawking, Steve Wozniak, Steven Pinker, strong AI, technological singularity, telepresence, telepresence robot, The Future of Employment, the scientific method, Turing machine, Turing test, universal basic income, Von Neumann architecture, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working poor, Works Progress Administration, Y Combinator
Bill Gates threw his hat in the ring on the side of the concerned: “I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.” Jaan Tallinn, one of the cofounders of Skype, refers to AI as “one of the many potential existential risks.” He goes on to add, optimistically, that “if we get the AI right, then basically we’ll be able to address all the other existential risks.” Steve Wozniak, cofounder of Apple, looks at it this way: “If a computer is a hundred times better than our brain, will it make the world perfect? Probably not, it will probably end up just like us, fighting.” And finally, the Oxford University philosopher Nick Bostrom likened the current effort to build an AGI to “children playing with a bomb.” Others in the industry find such doomsday concerns to be misguided. Andrew Ng, one of the most respected AI experts on the planet, says, “There’s also a lot of hype, that AI will create evil robots with super-intelligence.
It is believed that computers use roughly 10 percent of all of the electricity produced. They are so much a part of our lives that we literally may not be able to live without them, certainly not at our present standard of living. But our population may be large enough that without computers in the background managing everything from logistics to water treatment, their removal or incapacitation might cause a die-off of humans, especially in large cities. As Steve Wozniak said, “All of a sudden, we’ve lost a lot of control. We can’t turn off our Internet; we can’t turn off our smartphones; we can’t turn off our computers. You used to ask a smart person a question. Now who do you ask? It starts with g-o, and it’s not God.” In the 1960s and 1970s, we were building enough computers that it made sense to connect them to make one giant network. We call that the Internet.
We’ve already discussed at length how this came about, how the scientific method launched the Industrial Revolution and began an age of unprecedented innovation. Yet poverty remains. Yes, it has fallen substantially, but while the average per capita income on the planet is about $30 a day, a billion people get by on just $2 a day. What is their pathway out of poverty? While we can talk meaningfully about poverty rates in an apples-to-apples way starting around 1900, it bears little on our present situation, so let’s jump forward to 1980. The world’s population was four billion, and half of them lived on less than $2 a day, adjusted for inflation. By 1990, that number had fallen to 35 percent. In that year, the United Nations set a goal of halving poverty in twenty-five years. They reached that goal five years early, and so reconvened in 2010 to set a new goal: completely end poverty by 2030, a goal that we have a good shot at reaching.
Only Humans Need Apply: Winners and Losers in the Age of Smart Machines by Thomas H. Davenport, Julia Kirby
AI winter, Andy Kessler, artificial general intelligence, asset allocation, Automated Insights, autonomous vehicles, basic income, Baxter: Rethink Robotics, business intelligence, business process, call centre, carbon-based life, Clayton Christensen, clockwork universe, commoditize, conceptual framework, dark matter, David Brooks, deliberate practice, deskilling, digital map, disruptive innovation, Douglas Engelbart, Edward Lloyd's coffeehouse, Elon Musk, Erik Brynjolfsson, estate planning, fixed income, follow your passion, Frank Levy and Richard Murnane: The New Division of Labor, Freestyle chess, game design, general-purpose programming language, global pandemic, Google Glasses, Hans Lippershey, haute cuisine, income inequality, index fund, industrial robot, information retrieval, intermodal, Internet of things, inventory management, Isaac Newton, job automation, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joi Ito, Khan Academy, knowledge worker, labor-force participation, lifelogging, longitudinal study, loss aversion, Mark Zuckerberg, Narrative Science, natural language processing, Norbert Wiener, nuclear winter, pattern recognition, performance metric, Peter Thiel, precariat, quantitative trading / quantitative ﬁnance, Ray Kurzweil, Richard Feynman, risk tolerance, Robert Shiller, Robert Shiller, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, six sigma, Skype, social intelligence, speech recognition, spinning jenny, statistical model, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, superintelligent machines, supply-chain management, transaction costs, Tyler Cowen: Great Stagnation, Watson beat the top human players on Jeopardy!, Works Progress Administration, Zipcar
Bureau of Labor Statistics) to expand by 19 percent from 2008 to 2018, faster than most other jobs. Before we leave the subject of taste, we shouldn’t neglect to mention Steve Jobs again. It might strike you as odd that, in a book about the encroachment of computers into knowledge work, the name of Apple’s iconic founder would come up in the chapter devoted to stepping aside. But note that whenever Jobs’s genius is mentioned, the emphasis is on his sensibilities—his taste. No one denies he had strong technical knowledge, but according to his cofounder, Steve Wozniak, “Steve didn’t ever code. He wasn’t an engineer and he didn’t do any original design, but he was technical enough to alter and change and add to other designs.”4 As an undergrad at Reed College, Jobs studied physics, but also literature, poetry, and calligraphy. Jobs’s genius was the judicious tweak, and his extreme success comes down to the fact that he focused his time on those points where a tweak he could make would make all the difference.
Walter Kirn, “The Tao of Robert Downey, Jr.,” Rolling Stone, May 13, 2010. 2. Tricia Drevets, “How to Make Money Living off the Grid,” Off the Grid News, June 25, 2014, http://www.offthegridnews.com/financial/how-to-make-money-living-off-the-grid/. 3. Heather Plett, “What It Means to ‘Hold Space’ for People, plus Eight Tips on How to Do It Well,” Heather Plett blog, March 11, 2015, http://heatherplett.com/2015/03/hold-space/. 4. Steve Wozniak, “Does Steve Jobs Know How to Code?,” response to email posted on Woz.org, http://www.woz.org/letters/does-steve-jobs-know-how-code. 5. Dan Ariely, Predictably Irrational: The Hidden Forces That Shape Our Decisions, revised and expanded edition (New York: Harper Perennial, 2010). 6. Howard Gardner, Frames of Mind: The Theory of Multiple Intelligences, 3rd ed. (New York: Basic Books, 2011). 7.
First, machines relieved humans of work that was manually exhausting and mentally enervating. This was the story of the late industrial revolution, which, having pulled all those workers off farms and into factories, proceeded to make most of them unnecessary with contraptions like the flying shuttle, the spinning jenny, and the power loom. And it’s a process that continues around the world. Consider Foxconn, the Chinese manufacturing subcontractor to global electronics brands like Apple. Starting in 2011, it began putting robots on the lines to perform welding, polishing, and similar tasks—ten thousand of them that first year. In 2013, Chairman Terry Gou noted at Foxconn’s annual meeting that the firm now employed over a million people. But, he was quick to add: “In the future we will add one million robotic workers.”1 If that goal is realized, it will mean, of course, that some hundreds of thousands of human workers will never get hired—a loss of jobs for the local economy.
Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think by James Vlahos
Albert Einstein, AltaVista, Amazon Mechanical Turk, Amazon Web Services, augmented reality, Automated Insights, autonomous vehicles, Chuck Templeton: OpenTable:, cloud computing, computer age, Donald Trump, Elon Musk, information retrieval, Internet of things, Jacques de Vaucanson, Jeff Bezos, lateral thinking, Loebner Prize, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, Mark Zuckerberg, Menlo Park, natural language processing, PageRank, pattern recognition, Ponzi scheme, randomized controlled trial, Ray Kurzweil, Ronald Reagan, Rubik’s Cube, self-driving car, sentiment analysis, Silicon Valley, Skype, Snapchat, speech recognition, statistical model, Steve Jobs, Steve Wozniak, Steven Levy, Turing test, Watson beat the top human players on Jeopardy!
But instead, due to poor language understanding, the “profoundly disappointing” assistant was a “gimmick and a tease.” Apple ran commercials in which Zooey Deschanel, Samuel L. Jackson, John Malkovich, and Martin Scorsese stumped for Siri. But some users, feeling that those ads made false claims, filed class-action lawsuits against Apple claiming deception. Steve Wozniak, one of the original cofounders of Apple, got his licks in, too, implying to a reporter that Siri had worked better before the company had been acquired. Even Jack in the Box ran an ad that lampooned the speech recognition of a Siri-like virtual assistant. “Where’s the nearest Jack in the Box?” Jack asks the assistant during the ad. “I found four places that sell socks,” the assistant replies. Apple, in part, was paying a penalty for being first to market with an ambitious but immature technology.
Titans 39 Decades before he founded Amazon: Amazon CEO Jeff Bezos on how he got a role in Star Trek Beyond, posted to YouTube on October 23, 2016, https://goo.gl/RJKBL1. 39 “build space hotels”: Luisa Yanez, “Jeff Bezos: A rocket launched from Miami’s Palmetto High,” Miami Herald, August 5, 2013, https://goo.gl/GxFrx8. 40 After the discussion with Hart: Greg Hart, interview with author, April 27, 2018. 41 “If we could build it”: this and subsequent quotes from Greg Hart come from interview with author, April 27, 2018. 41 “We think it [the project] is critical to Amazon’s success”: this and subsequent quotes from Al Lindsay, unless otherwise identified, come from interview with author, April 4, 2018. 42 Rohit Prasad, a scientist whom Amazon hired: Rohit Prasad, interview with author, April 2, 2018. 44 Bezos was reportedly aiming for the stars: Joshua Brustein, “The Real Story of How Amazon Built the Echo,” Bloomberg Businessweek, April 19, 2016, https://goo.gl/4SIi8F. 44 “hero feature”: Prasad, interview with author. 44 An article in Bloomberg Businessweek : Brustein, “The Real Story of How Amazon Built the Echo.” 45 “Amazon just surprised everyone”: Chris Welch, “Amazon just surprised everyone with a crazy speaker that talks to you,” The Verge, November 6, 2014, https://goo.gl/sVgsPi. 45 “Don’t laugh at or ignore”: Mike Elgan, “Why Amazon Echo is the future of every home,” Computerworld, November 8, 2014, https://goo.gl/wriJXE. 45 “the happiest person in the world”: this and other quotes from Adam Cheyer, unless otherwise indicated, come from interviews with author, April 19 and 23, 2018. 45 “Apple’s digital assistant was delivered”: Farhad Manjoo, “Siri Is a Gimmick and a Tease,” Slate, November 15, 2012, https://goo.gl/2cSoK. 46 Steve Wozniak, one of the original cofounders of Apple: Bryan Fitzgerald, “‘Woz’ gallops in to a horse’s rescue,” Albany Times Union, June 13, 2012, https://goo.gl/dPdHso. 
46 Even Jack in the Box ran an ad: Yukari Iwatani Kane, Haunted Empire: Apple After Steve Jobs (New York: HarperCollins, 2014), 154. 46 Years later, some people who had worked: Aaron Tilley and Kevin McLaughlin, “The Seven-Year Itch: How Apple’s Marriage to Siri Turned Sour,” The Information, March 14, 2018, https://goo.gl/6e7BxM. 48 “artificially-intelligent orphan”: Bosker, “Siri Rising.” 48 “Siri’s various teams morphed”: Tilley and McLaughlin, “The Seven-Year Itch.” 48 John Burkey, who was part: John Burkey, interview with author, June 19, 2018. 49 “it’s really the first time in history”: Megan Garber, “Sorry, Siri: How Google Is Planning to Be Your New Personal Assistant,” The Atlantic, April 29, 2013, https://goo.gl/XFLPDP. 49 “We are not shipping”: Dan Farber, “Microsoft’s Bing seeks enlightenment with Satori,” CNET, July 30, 2013, https://goo.gl/fnLVmb. 50 CNN Tech ran an emblematic headline: Adrian Covert, “Meet Cortana, Microsoft’s Siri,” CNN Tech, April 2, 2014, https://goo.gl/pyoW4v. 50 “feels like a potent mashup of Google Now’s worldliness”: Chris Velazco, “Living with Cortana, Windows 10’s thoughtful, flaky assistant,” Engadget, July 30, 2015, https://goo.gl/mbZpon. 50 “arrogant disdain followed by panic”: Burkey, interview with author. 51 “I’ll start teaching it”: Mark Zuckerberg, “Building Jarvis,” Facebook blog, December 19, 2016, https://goo.gl/DyQSBN. 51 Zuckerberg might have to say a command: Daniel Terdiman, “At Home With Mark Zuckerberg And Jarvis, The AI Assistant He Built For His Family,” Fast Company, December 19, 2016, https://goo.gl/qJNIxW. 51 One lucky user who tested M: Alex Kantrowitz, “Facebook Reveals The Secrets Behind ‘M,’ Its Artificial Intelligence Bot,” BuzzFeed, November 19, 2015, https://goo.gl/bwmFyN. 52 “an experiment to see what people would ask”: Kemal El Moujahid, interview with author, September 29, 2017. 
54 “just the tip of the iceberg”: Mark Bergen, “Jeff Bezos says more than 1,000 people are working on Amazon Echo and Alexa,” Recode, May 31, 2016, https://goo.gl/hhSQXc. 59 “When you speak”: Robert Hoffer, interview with author, April 30, 2018. 4.
Curiously, though, Apple hasn’t taken a serious shot at the search business. Apple emphasizes Siri’s ability to help users accomplish tasks, especially those that utilize apps within the company’s ecosystem, and has traditionally downplayed the importance of general question answering. Apple, in fact, has always negotiated partnerships with other companies to supply many of Siri’s search results. These partners have included Microsoft, Yahoo, Wolfram Alpha, and Google. Apple is famously tight-lipped about its business strategies. But the rationale for steering clear of search likely goes something like this: Apple became the world’s most valuable company by selling stuff, not services. So long as iPhones and other devices made by the company continue to fly off the shelves, Apple doesn’t need to elbow its way into search.
Brotopia: Breaking Up the Boys' Club of Silicon Valley by Emily Chang
23andMe, 4chan, Ada Lovelace, affirmative action, Airbnb, Apple II, augmented reality, autonomous vehicles, barriers to entry, Bernie Sanders, Burning Man, California gold rush, Chuck Templeton: OpenTable:, David Brooks, Donald Trump, Elon Musk, equal pay for equal work, Ferguson, Missouri, game design, gender pay gap, Google Glasses, Google X / Alphabet X, Grace Hopper, high net worth, Hyperloop, Jeff Bezos, job satisfaction, Khan Academy, Lyft, Marc Andreessen, Mark Zuckerberg, Maui Hawaii, Menlo Park, meta-analysis, microservices, paypal mafia, Peter Thiel, post-work, pull request, ride hailing / ride sharing, rolodex, Saturday Night Live, shareholder value, side project, Silicon Valley, Silicon Valley startup, Skype, Snapchat, Steve Jobs, Steve Wozniak, Steven Levy, subscription business, Tim Cook: Apple, Travis Kalanick, uber lyft, women in the workforce
That year was the high point for the percentage of women earning degrees in computer science. As the number of overall computer science degrees picked back up leading into the dot-com boom, more men than women were filling those coveted seats. In fact, the percentage of women in the field would dramatically decline for the next two and a half decades. APPLE UPSETS THE NERD CART As women were leaving the tech world, a new type of tech hero was taking center stage. In 1976, Apple was co-founded by Steve Wozniak, your typical nerd, and Steve Jobs, who was not your typical nerd at all. Jobs exuded a style and confidence heretofore unseen in the computer industry. He had few technical skills—Wozniak handled all that—yet Jobs was a never-before-seen kind of tech rock star. He proved you could rise on the strength of other skills, such as conviction, product vision, marketing genius, and a willingness to take risks.
The cherry on top: when the troll responded inappropriately to a tweet in which I had tagged IBM CEO Ginni Rometty, after an interview I had conducted with her, Rometty herself was alerted with several cheerful notifications from Twitter. I’ve developed the requisite thick skin, and I use a common tactic for dealing with trolls: ignoring them. I quickly scroll past the vitriolic direct replies to my Twitter account, and I never, ever use Reddit. Once an interview I conducted with Apple’s co-founder Steve Wozniak ended up on Reddit, and the response was worse than unnerving. (For the same reason, many women in tech avoid using Hacker News, the prominent start-up incubator Y Combinator’s official bulletin board, which has since become one of the industry’s leading message boards; the trolls are there too.) Most important, I don’t respond to the haters. This is accepted wisdom among many female users: the worst way to deal with a troll is to poke it.
“Of course I do”: Sheryl Sandberg, “Sheryl Sandberg: Bloomberg Studio 1.0 (Full Show),” interview by author, Bloomberg, Aug. 9, 2017, video, 24:16, https://www.bloomberg.com/news/videos/2017-08-10/sheryl-sandberg-bloomberg-studio-1-0-full-show-video. “broader group of employees”: “Uber Report: Eric Holder’s Recommendations for Change.” A true marvel: Steven Levy, “One More Thing: Inside Apple’s Insanely Great (or Just Insane) New Mothership,” Wired, May 16, 2017, https://www.wired.com/2017/05/apple-park-new-silicon-valley-campus. “everything an Apple employee”: Beth Spotswood, “Apple’s Campus Has Everything—Oh, Except Daycare,” SFist, May 19, 2017, http://sfist.com/2017/05/19/apples_campus_has_everything_-_oh_e.php. Because child care has proven: Rose Marcario, “Patagonia’s CEO Explains How to Make On-Site Child Care Pay for Itself,” Fast Company, Aug. 15, 2016, https://www.fastcompany.com/3062792/patagonias-ceo-explains-how-to-make-onsite-child-care-pay-for-itself.
All Your Base Are Belong to Us: How Fifty Years of Video Games Conquered Pop Culture by Harold Goldberg
activist lawyer, Alexey Pajitnov wrote Tetris, Apple II, cellular automata, Columbine, Conway's Game of Life, G4S, game design, In Cold Blood by Truman Capote, Mars Rover, Mikhail Gorbachev, Ralph Waldo Emerson, Ray Oldenburg, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, The Great Good Place, Thorstein Veblen, urban planning
Tobey spent most of his time at the computer trying to make a game that was as close to real life as a computer in the 1980s could make it. Through word of mouth, Tobey’s flying and shooting game based on F-15 fighter jets came to the attention of Apple’s Steve Wozniak when Tobey was just sixteen. Wozniak was wowed by the sound, graphics, and game play. He kept saying, “This can’t be done on the Apple II. I can’t believe it. This can’t be done.” He gave Tobey a calling card and added a note to Trip Hawkins, which read, “Please consider this flight simulator as the finest Apple game ever done.” Hawkins didn’t waste any time. He wanted to make a deal right away. Tobey’s parents came with him to EA’s offices to oversee a lucrative royalty deal for Skyfox, a game that would eventually sell more than a million copies.
From plastic dust they were born and to plastic dust and desert sand they returned. In 1975 that plastic hadn’t been worthless at all. It was precious gold to the principals of Atari, and it would only become more valuable as the decade progressed. Atari’s arcade business was still thriving, and Home Pong exceeded sales expectations, and demand exceeded supply. Alcorn hired an unkempt and unshaven Steve Jobs, who in turn asked his best friend, the diffident genius Steve Wozniak, for help with what would be one of Atari’s most popular additions to its ever expanding library. Without telling Alcorn, Bushnell asked Jobs to help him streamline the innards of a brick-breaking arcade game called Breakout. Bushnell wanted to save money because the chips used in each arcade machine were still pricey at the time. He coaxed the brazen, odoriferous Jobs with $750 and a $100 bonus for each chip removed from the prototype.
When he finished, he still couldn’t find the right job in the nascent world of games. So he took a job at Apple Computer. As employee number sixty-eight and the company’s inaugural MBA, Hawkins was the first person at Apple to tackle the job of marketing. Within a year, Hawkins had worked his way up to an executive position at Apple. He was in the right place at the right time. Apple was the “it” company. Like Apple today, with the iPod and iPhone, the company could do little wrong. The media hyped the Apple II personal computers, and business (“an elixir for U.S. industry,” glowed the New York Times) and families loved the quality the technicians put into each piece of equipment. The computers, although fairly expensive, almost sold themselves, so much so that in his four years at Apple, Hawkins became a rich man with a niche he carefully carved for himself and his team: selling the computers to medium and large businesses.
What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry by John Markoff
Any sufficiently advanced technology is indistinguishable from magic, Apple II, back-to-the-land, beat the dealer, Bill Duvall, Bill Gates: Altair 8800, Buckminster Fuller, California gold rush, card file, computer age, computer vision, conceptual framework, cuban missile crisis, different worldview, Donald Knuth, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Thorp, El Camino Real, Electric Kool-Aid Acid Test, general-purpose programming language, Golden Gate Park, Hacker Ethic, hypertext link, informal economy, information retrieval, invention of the printing press, Jeff Rulifson, John Markoff, John Nash: game theory, John von Neumann, Kevin Kelly, knowledge worker, Mahatma Gandhi, Menlo Park, Mother of all demos, Norbert Wiener, packet switching, Paul Terrell, popular electronics, QWERTY keyboard, RAND corporation, RFC: Request For Comment, Richard Stallman, Robert X Cringely, Sand Hill Road, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, speech recognition, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, The Hackers Conference, Thorstein Veblen, Turing test, union organizing, Vannevar Bush, Whole Earth Catalog, William Shockley: the traitorous eight
At various times Engelbart has said that he found the original article in the library and at other times he has said he believed he first read the Life account of Vannevar Bush’s Memex. Whatever the case, it had a defining impact on him. 5.Vannevar Bush, “As We May Think,” Atlantic Monthly, July 1945. 6.Lowood and Adams, oral history. 7.Ibid. Twenty years later, a young Steve Wozniak, then a brand-new HP engineer, would ask the company if they wanted to sell a personal computer. HP said it wasn’t interested, and Wozniak went off to cofound Apple Computer. It was the second time the Silicon Valley pioneer missed an opportunity to define the future of computing. 8.Ibid. 9.Jack Goldberg, Stanford Research Institute, e-mail to author. 10.Author interview, Charles Rosen, Menlo Park, Calif., October 10, 2001. 11.Douglas C. Engelbart Collection, Stanford Special Libraries, Stanford University. 12.Author interview, Don Allen, Menlo Park, Calif., August 31, 2001. 13.Myron Stolaroff, Thanatos to Eros, 35 Years of Psychedelic Exploration (Berlin: VWB, 1994), p. 18. 14.Stolaroff, Thanatos to Eros, p. 19. 15.Ibid. 16.Ibid., p. 20. 17.Jay Stevens, Storming Heaven: LSD and the American Dream (New York: Grove Press, 1987), p. 53. 18.Stolaroff, Thanatos to Eros, p. 23. 19.Ibid., p. 25. 20.Kary Mullis, Dancing Naked in the Mind Field (New York: Pantheon Books, 1998). 21.Author interview, Don Allen, Menlo Park, Calif., August 22, 2001. 22.Vic Lovell, “The Perry Lane Papers (III): How It Was,” in One Lord, One Faith, One Cornbread, eds.
Terminal, TV typewriter? I/O Device? or some other digital black-magic box? Or are you buying time on a time-sharing service? If so you might like to come to a gathering of people with like-minded interests. Exchange information, swap ideas, talk shop, help work on a project, whatever…13 One person who saw the flyer was Allen Baum, who was working at Hewlett-Packard at the time with his friend Steve Wozniak. The two had met in high school when Baum had seen Wozniak sitting in his homeroom class drawing strange graphics in a notebook. “What are you doing?” Baum asked. “I’m designing a computer,” was Wozniak’s reply. It turned out that Baum had on his own become intrigued with computers just months earlier after his father, who had moved the family from the East Coast, took a job at Stanford Research Institute.
It seemed inevitable that the old order would collapse and that a different, more spiritual path—to somewhere—lay just ahead. For some of Silicon Valley’s most influential figures, the connection between personal computing and the counterculture has not been forgotten. Early in 2001, I met with Apple’s cofounder, Steve Jobs. I have interviewed Jobs dozens of times over two decades and have come to know his moods well. This was not one of our better conversations. A photographer had accompanied me, and if there is one way to insure that Apple’s mercurial chief executive will be irritated, it is to attempt to take his picture during an interview. After only a handful of photographs, Jobs threw the photographer out, and things went downhill from there. Jobs was in a particularly bad mood. However, as our session ended, he sat down in front of one of his Macintosh computers to demonstrate a new program he had introduced earlier that morning before the legions of faithful. iTunes was to turn any Macintosh into a digital music player that stored and played CDs or music downloaded from the Internet.
Why Wall Street Matters by William D. Cohan
Apple II, asset-backed security, bank run, Bernie Sanders, Blythe Masters, bonus culture, break the buck, buttonwood tree, corporate governance, corporate raider, creative destruction, Credit Default Swap, Donald Trump, Exxon Valdez, financial innovation, financial repression, Fractional reserve banking, Gordon Gekko, greed is good, income inequality, Joseph Schumpeter, London Interbank Offered Rate, margin call, money market fund, moral hazard, Potemkin village, quantitative easing, secular stagnation, Snapchat, South Sea Bubble, Steve Jobs, Steve Wozniak, too big to fail, WikiLeaks
This is a good thing. Objectively speaking, we learn from the Apple prospectus that there would be no Apple, at least in its present form, without Wall Street. The prospectus explains that Apple had a relatively large group of early investors who supported the company from its inception in 1976, when Steve Jobs and Steve Wozniak, the two founders, “designed, developed and assembled the Apple I, a microprocessor-based computer consisting of a single printed circuit board.” On January 3, 1977, Apple incorporated; three months later, it introduced the Apple II, which was similar to the Apple I but with a keyboard and a plastic cover. For the nine months leading up to the end of September 1977, Apple had a profit of almost $45,000. But Apple had big ambitions, as the prospectus makes clear, and achieving those ambitions required capital.
Most important, at first, were the venture capitalists, such as Venrock Associates, in New York City, which had a 7.6 percent stake in Apple at the time of its IPO, and Arthur Rock, a former banker at Hambrecht & Quist, a small technology-oriented investment bank in San Francisco. Rock had a 1.3 percent stake in Apple. There were other venture capitalists, too, and together they owned another 8.7 percent of Apple before its IPO. As for Jobs, then twenty-five years old, and Wozniak, then thirty years old, they had stakes in Apple of 15 percent and 7.9 percent, respectively. A. C. Markkula Jr., Apple’s chief marketing executive since May 1977 and also the chairman of the board of directors, had a 14 percent stake. Michael Scott, Apple’s short-lived first CEO, bought his stake of nearly 1.3 million shares for a penny a share when he joined Apple in May 1977. The venture capitalists backing Apple did so for one reason: They were hoping to make money.
It doesn’t deserve or warrant extra vilification as a result. For instance, it’s no surprise that Apple would not exist if it weren’t profitable, or weren’t able to convince investors that one day it would be (as companies such as Amazon have been able to do for years). The fact that Apple is one of the most profitable companies in the world enables it to hire the best, the brightest, and the most creative people and pay them well. Apple’s success allows it to buy new equipment and to build new plants—including a space-age, $5 billion circular headquarters in Cupertino, California—and, of course, it allows Apple to design and to build new groundbreaking products, such as the iPod, the iPhone, and the Apple Watch, and to dream about what the future will look like, whether it includes the Apple car or the Apple personal transporter, like The Jetsons.
Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries by Safi Bahcall
accounting loophole / creative accounting, Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, Astronomia nova, British Empire, Cass Sunstein, Charles Lindbergh, Clayton Christensen, cognitive bias, creative destruction, disruptive innovation, diversified portfolio, double helix, Douglas Engelbart, Douglas Engelbart, Edmond Halley, Gary Taubes, hypertext link, invisible hand, Isaac Newton, Johannes Kepler, Jony Ive, knowledge economy, lone genius, Louis Pasteur, Mark Zuckerberg, Menlo Park, Mother of all demos, Murray Gell-Mann, PageRank, Peter Thiel, Philip Mirowski, Pierre-Simon Laplace, prediction markets, pre–internet, Ralph Waldo Emerson, RAND corporation, random walk, Richard Feynman, Richard Thaler, side project, Silicon Valley, six sigma, Solar eclipse in 1919, stem cell, Steve Jobs, Steve Wozniak, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Tim Cook: Apple, tulip mania, Wall-E, wikimedia commons, yield management
WHEN MOSES DOUBLES DOWN The facts of Jobs’s forced exit from Apple in 1985, and his path to the mess at NeXT, have been well laid out. In 1975, Steve Wozniak combined a microprocessor, keyboard, and screen into one of the earliest personal computers. Jobs convinced Wozniak to quit his job and start a company. After some initial success with their Apple I and II, however, competitors quickly passed Apple by. In 1980, Atari and Radio Shack (TRS-80) sold roughly seven times as many computers as Apple. By 1983, Commodore dominated the market, with the IBM PC, launched only two years earlier, a close second. Apple’s share had dropped to less than 10 percent and was shrinking rapidly. Apple’s attempts to win back the spotlight with the Apple III and the Lisa, projects led by Jobs until he lost interest (in one case) or was kicked off (in the other), flopped.
The less-famous history of an ultra-famous icon captures one person’s evolution toward this balance. During Steve Jobs’s first stint at Apple, he called his loonshot group working on the Mac “pirates” or “artists” (he saw himself, of course, as the ultimate pirate-artist). Jobs dismissed the group working on the Apple II franchise as “regular Navy.” The hostility he created between the two groups, by lionizing the artists and belittling the soldiers, was so great that the street between their two buildings was known as the DMZ—the demilitarized zone. The hostility undermined both products. Steve Wozniak, Apple’s cofounder along with Jobs, who was working on the Apple II franchise, left, along with other critical employees; the Mac launch failed commercially; Apple faced severe financial pressure; Jobs was exiled; and John Sculley took over (eventually rescuing the Mac and restoring financial stability).
Jobs proudly and publicly referred to his team, working on the Macintosh, as artists. He referred to the rest of the company, developing the Apple II franchise, as bozos. Apple II engineers took to wearing buttons with a circle and line running through an image of Bozo the Clown. Wozniak, an engineer with the demeanor of a teddy bear, was widely beloved at the company and in the industry. He resigned, openly complaining about the demoralizing attacks. Departures in the Apple II group became so common that one joke ran, “If your boss calls, be sure to get his name.” The toxicity spread. Key designers on the Macintosh side soon began leaving as well. It didn’t take long for the Apple Board of Directors and its recently hired CEO, John Sculley, to conclude the dysfunction was not sustainable. Jobs was stripped of operating responsibility in the spring.
Future Crimes: Everything Is Connected, Everyone Is Vulnerable and What We Can Do About It by Marc Goodman
23andMe, 3D printing, active measures, additive manufacturing, Affordable Care Act / Obamacare, Airbnb, airport security, Albert Einstein, algorithmic trading, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Joy: nanobots, bitcoin, Black Swan, blockchain, borderless world, Brian Krebs, business process, butterfly effect, call centre, Charles Lindbergh, Chelsea Manning, cloud computing, cognitive dissonance, computer vision, connected car, corporate governance, crowdsourcing, cryptocurrency, data acquisition, data is the new oil, Dean Kamen, disintermediation, don't be evil, double helix, Downton Abbey, drone strike, Edward Snowden, Elon Musk, Erik Brynjolfsson, Filter Bubble, Firefox, Flash crash, future of work, game design, global pandemic, Google Chrome, Google Earth, Google Glasses, Gordon Gekko, high net worth, High speed trading, hive mind, Howard Rheingold, hypertext link, illegal immigration, impulse control, industrial robot, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Jaron Lanier, Jeff Bezos, job automation, John Harrison: Longitude, John Markoff, Joi Ito, Jony Ive, Julian Assange, Kevin Kelly, Khan Academy, Kickstarter, knowledge worker, Kuwabatake Sanjuro: assassination market, Law of Accelerating Returns, Lean Startup, license plate recognition, lifelogging, litecoin, low earth orbit, M-Pesa, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Metcalfe’s law, MITM: man-in-the-middle, mobile money, more computing power than Apollo, move fast and break things, Nate Silver, national security letter, natural language processing, obamacare, Occupy movement, Oculus Rift, off grid, offshore financial centre, optical character recognition, Parag Khanna, pattern recognition, peer-to-peer, personalized medicine, Peter H. 
Diamandis: Planetary Resources, Peter Thiel, pre–internet, RAND corporation, ransomware, Ray Kurzweil, refrigerator car, RFID, ride hailing / ride sharing, Rodney Brooks, Ross Ulbricht, Satoshi Nakamoto, Second Machine Age, security theater, self-driving car, shareholder value, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart grid, smart meter, Snapchat, social graph, software as a service, speech recognition, stealth mode startup, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, supply-chain management, technological singularity, telepresence, telepresence robot, Tesla Model S, The Future of Employment, The Wisdom of Crowds, Tim Cook: Apple, trade route, uranium enrichment, Wall-E, Watson beat the top human players on Jeopardy!, Wave and Pay, We are Anonymous. We are Legion, web application, Westphalian system, WikiLeaks, Y Combinator, zero day
In the early days of hacking, it was the telephone system that was the target of hackers’ attention as so-called phone phreaks manipulated the network to avoid the sky-high costs of long-distance calls. Let’s not forget two hackers who spent part of their youth back in 1971 building “blue boxes,” devices capable of hacking the phone network and making free calls: Steve Wozniak and Steve Jobs. The pair sold blue boxes to students at UC Berkeley as a means of making money that would effectively help fund their other small start-up, the Apple computer company. As time passed, other notable hackers emerged, such as Kevin Mitnick and Kevin Poulsen. Mitnick famously broke into the Digital Equipment Corporation’s computers at the age of sixteen and went on to a string of such cyber intrusions, earning him the FBI’s ire and the distinction of being “America’s most wanted hacker.”
There are many private sector professional organizations that could prove immensely helpful in jump-starting such efforts, such as the International Information Systems Security Certification Consortium, or (ISC)², a nonprofit with over 100,000 certified cybersecurity professionals at the ready, capable of having a positive impact on any such effort should they choose. Crime, Inc. is out there busily recruiting minions for its efforts. Shouldn’t we be doing the same? People of all stripes and backgrounds can help with these endeavors—young, old, and even some hackers who surely have the skill set to make a difference, should they wish to direct their talents for public benefit. As the Apple co-founder Steve Wozniak reminds us, “Some challenging of the rules is good.” We need to help create opportunities, particularly for young people, to channel their considerable talents and energies for good, lest Crime, Inc. engage them for ill. The exponential nature of technology and the linear response of government mean we will need many more hands on deck to help build a safe and stable society that won’t destroy itself.
For the criminal angle, see David Shamah, “Hack Attacks on Infrastructure on the Rise, Experts Say,” Times of Israel, Jan. 30, 2014.
18 President Obama when he noted: Barack Obama, “Remarks by the President on Securing Our Nation’s Cyber Infrastructure,” The White House Office of the Press Secretary, May 29, 2009.
19 Each plays its role: “War in the Fifth Domain,” Economist, July 5, 2010.
20 Let’s not forget two hackers: Phil Lapsley, “The Definitive Story of Steve Wozniak, Steve Jobs, and Phone Phreaking,” Atlantic, Feb. 20, 2013.
21 As time passed, other notable hackers: Kevin D. Mitnick and William L. Simon, Ghost in the Wires: My Adventures as the World’s Most Wanted Hacker (New York: Little, Brown, 2012).
22 Poulsen’s ingenious 1990 hack: Jonathan Littman, “The Last Hacker,” Los Angeles Times, Sept. 12, 1993.
23 For example, in October 2013: “Adobe Hack: At Least 38 Million Accounts Breached,” BBC, Oct. 30, 2013.
24 But what changed in that attack: Brian Krebs, “Adobe to Announce Source Code, Customer Data Breach,” Krebs on Security, Oct. 3, 2013.
25 Yep, the company that is selling: Darlene Storm, “AntiSec Leaks Symantec pcAnywhere Source Code After $50K Extortion Not Paid,” Computerworld, Feb. 7, 2012.
26 Traditional organized crime groups: The Hague, Threat Assessment: Italian Organized Crime, Europol Public Information, June 2013; Nir Kshetri, The Global Cybercrime Industry: Economic, Institutional, and Strategic Perspectives (London: Springer, 2010), 1; Chuck Easttom, Computer Crime, Investigation, and the Law (Boston: Cengage Learning, 2010), 206.
27 These newly emerging: Mark Milian, “Top Ten Hacking Countries,” Bloomberg, April 23, 2013.
28 New syndicates: Brian Krebs, “Shadowy Russian Firm Seen as Conduit for Cybercrime,” Washington Post, Oct. 13, 2007; Verisign iDefense, The Russian Business Network: Survey of a Criminal ISP, June 27, 2007.
29 RBN famously provides: Trend Micro, The Business of Cybercrime: A Complex Business Model, Jan. 2010.
30 ShadowCrew operated the now-defunct Web site: Kevin Poulsen, “One Hacker’s Audacious Plan to Rule the Black Market in Stolen Credit Cards,” Wired, Dec. 22, 2008.
31 Founded by the notorious criminal hacker: James Verini, “The Great Cyberheist,” New York Times Magazine, Nov. 10, 2010.
32 The number and reach: John E.
The One Device: The Secret History of the iPhone by Brian Merchant
Airbnb, animal electricity, Apple II, Apple's 1984 Super Bowl advert, citizen journalism, Claude Shannon: information theory, computer vision, conceptual framework, Douglas Engelbart, Dynabook, Edward Snowden, Elon Musk, Ford paid five dollars a day, Frank Gehry, global supply chain, Google Earth, Google Hangouts, Internet of things, Jacquard loom, John Gruber, John Markoff, Jony Ive, Lyft, M-Pesa, MITM: man-in-the-middle, more computing power than Apollo, Mother of all demos, natural language processing, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shock, pattern recognition, peak oil, pirate software, profit motive, QWERTY keyboard, ride hailing / ride sharing, rolodex, Silicon Valley, Silicon Valley startup, skunkworks, Skype, Snapchat, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Tim Cook: Apple, Turing test, uber lyft, Upton Sinclair, Vannevar Bush, zero day
John Draper, another legendary hacker who came to be known as Captain Crunch, found that the pitch of a toy whistle that came free in Cap’n Crunch cereal boxes could be used to open long-distance call lines; he built blue boxes, electronic devices that generated the tone, and demonstrated the technology to a young Steve Wozniak and his friend Steve Jobs. Jobs famously turned the blue boxes into his first ad hoc entrepreneurial effort; Woz built them, and Jobs sold them. The culture of hacking, reshaping, and bending consumer technologies to one’s personal will is as old as the history of those technologies. The iPhone is not immune. In fact, hackers helped push the phone toward adopting its most successful feature, the App Store. The fact that the first iPhones were sold exclusively through AT&T meant that they were, in a sense, a luxury phone. At $499 for the low-end 4 GB model, they were expensive. Every Apple diehard around the world wanted one immediately, but unless you were willing to sign on with AT&T and you lived in the United States, you were out of luck.
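The whistle trick the excerpt describes rested on a single, well-documented fact: AT&T's in-band signaling treated a sustained 2600 Hz tone as the mark of an idle long-distance trunk, and the Cap'n Crunch whistle happened to produce that pitch. A blue box simply synthesized such signaling tones electronically. As a purely illustrative sketch (not from any of the excerpted books), here is how one second of that tone could be computed digitally; the sample rate and function name are this example's own choices:

```python
import math

SAMPLE_RATE = 44_100   # samples per second (CD-style rate, chosen for illustration)
TRUNK_IDLE_HZ = 2600   # the historical trunk-idle signaling frequency

def tone(freq_hz: float, seconds: float, rate: int = SAMPLE_RATE) -> list[float]:
    """Return a pure sine tone as a list of floats in [-1.0, 1.0]."""
    n = int(rate * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / rate) for i in range(n)]

# One second of the 2600 Hz tone a blue box would have generated.
samples = tone(TRUNK_IDLE_HZ, 1.0)
print(len(samples))  # 44100
```

The sample list could be fed to any audio output path (e.g. written as PCM with the standard-library `wave` module); the point here is only that the "device that generated the tone" reduces to a single sine wave at a known frequency.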
His head is clean-shaven and he sports a thick, graying mustache and a quick, mischievous smile. He grew up in Florida with a love of tinkering and gadgets; he was more Wozniak than Jobs, and experimented with hardware in his spare time. “I was a hacker, and hackers, well, from that era—hackers meant you could build a computer from scratch. So, I was building computers,” he says, some based on the motherboard designs of Steve Wozniak. “It was unfortunate, I’ll call it, to live in Florida, outside of where the [Silicon] Valley stuff was going on.” He graduated with a degree in electrical engineering from the Florida Institute of Technology and went to work for IBM. He stayed at the company for sixteen years, rising through the ranks thanks to his mastery of both hardware and software. In the 1980s, he joined an “advanced research team” that was in charge of engineering IBM’s first laptop computer and making it as small as possible.
But there are other situations—such as when photos that Apple helped law enforcement unlock sent two people who had sexually abused a sixteen-month-old child to prison—that help make a case for Apple’s cooperation. (Which, it should be added, the company has provided in the past: Apple has reportedly opened over seventy iPhones at the behest of law enforcement, though many of those were before the Secure Enclave necessitated a novel software hack from Apple.) There may need to be a mechanism for law enforcement to access this stuff, but how we do that in the age of the Secure Enclave is an open question. For Apple, security is a question of product too. As it moves to promote Apple Pay, internet-of-things apps, and HealthKit, consumers must be confident their data can be kept safe. From a consumer’s perspective, Apple’s decision is win-win; it may be unpopular, but the message is clear: You won’t find a more secure phone anywhere.
It's Better Than It Looks: Reasons for Optimism in an Age of Fear by Gregg Easterbrook
affirmative action, Affordable Care Act / Obamacare, air freight, autonomous vehicles, basic income, Bernie Madoff, Bernie Sanders, Branko Milanovic, business cycle, Capital in the Twenty-First Century by Thomas Piketty, clean water, coronavirus, David Brooks, David Ricardo: comparative advantage, deindustrialization, Dissolution of the Soviet Union, Donald Trump, Elon Musk, Exxon Valdez, factory automation, failed state, full employment, Gini coefficient, Google Earth, Home mortgage interest deduction, hydraulic fracturing, Hyperloop, illegal immigration, impulse control, income inequality, Indoor air pollution, interchangeable parts, Intergovernmental Panel on Climate Change (IPCC), invisible hand, James Watt: steam engine, labor-force participation, liberal capitalism, longitudinal study, Lyft, mandatory minimum, manufacturing employment, Mikhail Gorbachev, minimum wage unemployment, obamacare, oil shale / tar sands, Paul Samuelson, peak oil, plutocrats, Plutocrats, Ponzi scheme, post scarcity, purchasing power parity, quantitative easing, reserve currency, rising living standards, Robert Gordon, Ronald Reagan, self-driving car, short selling, Silicon Valley, Simon Kuznets, Slavoj Žižek, South China Sea, Steve Wozniak, Steven Pinker, supervolcano, The Chicago School, The Rise and Fall of American Growth, the scientific method, There's no reason for any individual to have a computer in his home - Ken Olsen, Thomas Kuhn: the structure of scientific revolutions, Thomas Malthus, transaction costs, uber lyft, universal basic income, War on Poverty, Washington Consensus, WikiLeaks, working poor, Works Progress Administration
Or electronic intelligence might permanently be constrained to running whatever people allow such devices to be connected to. But danger is real. Three generations ago, the advent of thermonuclear explosions appeared to doom humanity; instead, the world has grown more peaceful since then. In the next generation, artificial intelligence may become an existential threat. In 2015, Elon Musk, Martin Rees, Francesca Rossi, Steve Wozniak, and other luminaries of the tech and physics realms warned that artificial intelligence could be a great benefit but also could cause society great harm. The time to impose regulation on artificial intelligence, they said, is now—before chips are capable of thinking for themselves. Laws have mandated basic safety for a range of products, including cars and flying machines. Laws that mandate kill switches for electronic devices are in order.
Japan, the nation with the longest life spans: See the “Life Expectancy” page maintained by the World Health Organization, detailing life expectancy around the globe, at http://www.who.int/gho/mortality_burden_disease/life_tables/situation_trends/en/.
The Yale University computer scientist David Gelernter forecast: David Gelernter, The Tides of Mind (New York: Liveright, 2016).
Elon Musk, Martin Rees, Francesca Rossi, Steve Wozniak, and other luminaries: See “Research Priorities for Robust and Beneficial Artificial Intelligence,” an open letter with over 8,000 signatories to date, available at Future of Life Institute, https://futureoflife.org/ai-open-letter/.
Alan Robock of Rutgers University and Owen Toon of the University of Colorado calculate: Alan Robock and Owen Toon, “The Climate Impacts of Nuclear War,” Bulletin of the Atomic Scientists, 2012.
The United States outproduces China and Russia combined: As elsewhere in this book, this is the standard calculation—employed by the World Bank, the CIA, and other authorities—for gauging GDP by exchange rates. If instead the gauge is purchasing power parity, China looks better. Russia remains an economic basket case under all forms of measurement.
As Deirdre McCloskey, an economic historian… has written: Deirdre McCloskey, Bourgeois Equality (Chicago: University of Chicago Press, 2016).
In early 2017, the market capitalization of Apple reached $800 billion: Anita Balakrishnan, “Apple Market Cap Tops $800 Billion,” CNBC, May 8, 2017.
Most of the world’s great colleges and universities are in the United States: Benjamin Wildavsky, Reinventing Higher Education (Cambridge, MA: Harvard Education Press, 2011).
“dissent is not permissible,” The Atlantic said in 2016: James Fallows, “China’s Great Leap Backward,” The Atlantic, December 2016.
Military historians tend to conclude: John Keegan, The Face of Battle (London: Jonathan Cape, 1976).
The Attention Merchants: The Epic Scramble to Get Inside Our Heads by Tim Wu
1960s counterculture, Affordable Care Act / Obamacare, AltaVista, Andrew Keen, anti-communist, Apple II, Apple's 1984 Super Bowl advert, barriers to entry, Bob Geldof, borderless world, Brownian motion, Burning Man, Cass Sunstein, citizen journalism, colonial rule, East Village, future of journalism, George Gilder, Golden Gate Park, Googley, Gordon Gekko, housing crisis, informal economy, Internet Archive, Jaron Lanier, Jeff Bezos, jimmy wales, Live Aid, Mark Zuckerberg, Marshall McLuhan, McMansion, Nate Silver, Network effects, Nicholas Carr, placebo effect, post scarcity, race to the bottom, road to serfdom, Saturday Night Live, science of happiness, self-driving car, side project, Silicon Valley, slashdot, Snapchat, Steve Jobs, Steve Wozniak, Steven Levy, Ted Nelson, telemarketer, the built environment, The Chicago School, the scientific method, The Structural Transformation of the Public Sphere, Tim Cook: Apple, Torches of Freedom, Upton Sinclair, upwardly mobile, white flight, zero-sum game
They made easier the entry into the home of not just more consoles, but also home computers, like the Apple II or the Commodore 64, for it was one thing to buy an expensive machine that supposedly would be used for work or programming; it was another to get all that along with the spoonful of sugar, namely, a machine that also came with even better games than the Atari had. In this way video games were arguably the killer app—the application that justifies the investment—of many computers in the home. As a game machine, sometimes used for other purposes, computers had gained their foothold. There they would lie for some time, a sleeping giant.7
* * *
* Breakout was written by Apple’s cofounders, Steve Wozniak and Steve Jobs, as a side project, as described in The Master Switch, chapter 20.
CHAPTER 16: AOL PULLS ’EM IN
In 1991, when Steve Case, just thirty-three years old, was promoted to CEO of AOL, there were four companies, all but one lost to history, that shared the goal of trying to get Americans to spend more leisure time within an abstraction known as an “online computer network.”
While still primitive in various ways, and still offering nothing like the draw of television, the computer, the third screen, had arrived. In the end, AOL was no corporate Ozymandias; though a failure, it would have a lasting and monumental legacy. True to its name, it got America online—reaching out to one another, ready for the biggest attention harvest since television.
* * *
*1 Before this, personal computers had come in the now unrecognizable form of hobbyist kits, assembled and programmed by guys like Steve Wozniak of Apple. For more, see The Master Switch, 274–75.
*2 Example: +++, ATDT (416) 225-9492.
*3 The movie also proved an opportunity for the first meetings between AOL and Time Warner executives: Steve Case and Jerry Levin met at a White House screening of the film. See The Master Switch, chapter 19.
*4 The “floppy” disk was a magnetic storage medium used in the 1980s and early 1990s, originally the size of a dinner napkin, that was inserted into a “disk drive” that resembled a toaster.
CHAPTER 28: WHO’S BOSS HERE?
1. Matthew Panzarino, “Apple’s Tim Cook Delivers Blistering Speech on Encryption, Privacy,” TechCrunch, June 2, 2015, http://techcrunch.com/2015/06/02/apples-tim-cook-delivers-blistering-speech-on-encryption-privacy/.
2. Ibid.
3. Tim Cook, “Apple’s Commitment to Your Privacy,” Apple, http://www.apple.com/privacy/.
4. Robin Anderson, Consumer Culture and TV Programming (Boulder, CO: Westview Press, 1995).
5. Richard Serra and Carlota Fay Schoolman, “Television Delivers People,” Persistence of Vision—Volume 1: Monitoring the Media (1973), video.
6. Farhad Manjoo, “What Apple’s Tim Cook Overlooked in His Defense of Privacy,” New York Times, June 10, 2015, http://www.nytimes.com/2015/06/11/technology/what-apples-tim-cook-overlooked-in-his-defense-of-privacy.html?
Surveillance Valley: The Rise of the Military-Digital Complex by Yasha Levine
23andMe, activist fund / activist shareholder / activist investor, Airbnb, AltaVista, Amazon Web Services, Anne Wojcicki, anti-communist, Apple's 1984 Super Bowl advert, bitcoin, borderless world, British Empire, call centre, Chelsea Manning, cloud computing, collaborative editing, colonial rule, computer age, computerized markets, corporate governance, crowdsourcing, cryptocurrency, digital map, don't be evil, Donald Trump, Douglas Engelbart, Douglas Engelbart, drone strike, Edward Snowden, El Camino Real, Electric Kool-Aid Acid Test, Elon Musk, fault tolerance, George Gilder, ghettoisation, global village, Google Chrome, Google Earth, Google Hangouts, Howard Zinn, hypertext link, IBM and the Holocaust, index card, Jacob Appelbaum, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Julian Assange, Kevin Kelly, Kickstarter, life extension, Lyft, Mark Zuckerberg, market bubble, Menlo Park, Mitch Kapor, natural language processing, Network effects, new economy, Norbert Wiener, packet switching, PageRank, Paul Buchheit, peer-to-peer, Peter Thiel, Philip Mirowski, plutocrats, Plutocrats, private military company, RAND corporation, Ronald Reagan, Ross Ulbricht, Satoshi Nakamoto, self-driving car, sentiment analysis, shareholder value, side project, Silicon Valley, Silicon Valley startup, Skype, slashdot, Snapchat, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Telecommunications Act of 1996, telepresence, telepresence robot, The Bell Curve by Richard Herrnstein and Charles Murray, The Hackers Conference, uber lyft, Whole Earth Catalog, Whole Earth Review, WikiLeaks
He also launched the Good Business Network, a corporate consulting company that applied his counterculture public relations strategies to problems faced by clients such as Shell Oil, Morgan Stanley, Bechtel, and DARPA.31 He also organized an influential computer conference that brought together leading computer engineers and journalists.32 It was called, simply, “Hackers’ Conference” and was held in Marin County in 1984. About 150 of the country’s top computer geniuses attended, including Apple’s Steve Wozniak. Brand cleverly stage-managed the event to give the group maximum cultural cachet. To hear him and other believers tell it, the event was the “Woodstock of the computer elite!” Newspaper accounts regaled readers with tales of strange nerds with fantastical visions of the future. “Giving a computer self-hood. The greatest hack is artificial consciousness,” one attendee told a Washington Post reporter.
The Electronic Frontier Foundation was founded by Lotus Notes creator Mitch Kapor, cattle rancher and Grateful Dead songwriter John Perry Barlow, and early Sun Microsystems employee John Gilmore. It started out with a vague mission: to defend people’s civil liberties on the Internet and to “find a way of preserving the ideology of the 1960s” in the digital era. From its first days, EFF had deep pockets and featured an impressive roster: Stewart Brand and Apple’s Steve Wozniak were board members, while press outreach was conducted by Cathy Cook, who had done public relations for Steve Jobs. It did not take long for EFF to find its calling: lobbying Congress on behalf of the budding Internet service providers that came out of the NSFNET network and pushing for a privatized Internet system, where the government stayed pretty much out of the way—“Designing the Future Net” is how EFF’s Barlow described it. 103.
It also launched the cyberpunk movement, which responded to Gibson’s political critique in a cardinally different manner: it cheered the coming of this cyber dystopia. Computers and hackers were countercultural rebels taking on power. They were cool. That same year, Apple Computer released its “1984” ad for the Macintosh. Directed by Ridley Scott, who had just wowed audiences with the dystopian hit Blade Runner, and aired during the Super Bowl, Apple’s message could not have been more clear: forget what you know about IBM or corporate mainframes or military computer systems. With Apple at the helm, personal computers are the opposite of what they used to be: they are not about domination and control but about individual rebellion and empowerment. “In a striking departure from the direct, buy-this-product approach of most American corporations, Apple Computer introduced its new line of personal computers with the provocative claim that Macintosh would help save the world from the lockstep society of George Orwell’s novel,” reported the New York Times.36 Interestingly, the paper pointed out that the “1984” ad had grown out of another campaign that the company had abandoned but that had explicitly talked about the ability to misuse computers.
Blockchain Revolution: How the Technology Behind Bitcoin Is Changing Money, Business, and the World by Don Tapscott, Alex Tapscott
Airbnb, altcoin, asset-backed security, autonomous vehicles, barriers to entry, bitcoin, blockchain, Blythe Masters, Bretton Woods, business process, buy and hold, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, clean water, cloud computing, cognitive dissonance, commoditize, corporate governance, corporate social responsibility, creative destruction, Credit Default Swap, crowdsourcing, cryptocurrency, disintermediation, disruptive innovation, distributed ledger, Donald Trump, double entry bookkeeping, Edward Snowden, Elon Musk, Erik Brynjolfsson, Ethereum, ethereum blockchain, failed state, fiat currency, financial innovation, Firefox, first square of the chessboard, first square of the chessboard / second half of the chessboard, future of work, Galaxy Zoo, George Gilder, glass ceiling, Google bus, Hernando de Soto, income inequality, informal economy, information asymmetry, intangible asset, interest rate swap, Internet of things, Jeff Bezos, jimmy wales, Kickstarter, knowledge worker, Kodak vs Instagram, Lean Startup, litecoin, Lyft, M-Pesa, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, means of production, microcredit, mobile money, money market fund, Network effects, new economy, Oculus Rift, off grid, pattern recognition, peer-to-peer, peer-to-peer lending, peer-to-peer model, performance metric, Peter Thiel, planetary scale, Ponzi scheme, prediction markets, price mechanism, Productivity paradox, QR code, quantitative easing, ransomware, Ray Kurzweil, renewable energy credits, rent-seeking, ride hailing / ride sharing, Ronald Coase, Ronald Reagan, Satoshi Nakamoto, Second Machine Age, seigniorage, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, smart grid, social graph, social intelligence, social software, standardized shipping container, Stephen Hawking, Steve Jobs, Steve Wozniak, Stewart Brand, supply-chain management, TaskRabbit, The Fortune at the Bottom of the Pyramid, The Nature of the Firm, 
The Wisdom of Crowds, transaction costs, Turing complete, Turing test, Uber and Lyft, uber lyft, unbanked and underbanked, underbanked, unorthodox policies, wealth creators, X Prize, Y2K, Zipcar
IBM embraced Linux and donated hundreds of millions of dollars’ worth of software to the Linux community. In doing so, IBM saved $900 million a year developing its own proprietary systems and created a platform on which it built a multibillion-dollar software and services business. Experience shows that long-term sustainability of volunteer communities can be challenging. In fact, some of the more successful communities have found ways to compensate members for their hard work. As Steve Wozniak said to Stewart Brand, “Information should be free, but your time should not.”19 In the case of Linux, most of the participants get paid by companies like IBM or Google to ensure that Linux meets their strategic needs. Linux is still an example of social production. Benkler told us, “The fact that some developers are paid by third parties to participate does not change the governance model of Linux, or the fact that it is socially developed.”
Or it could release the private data from other servers or hold the data hostage until we human owners paid a ransom. Once machines have intelligence and the ability to learn, how quickly will they become autonomous? Will military drones and robots, for example, decide to turn on civilians? According to researchers in AI, we’re only years, not decades, away from the realization of such weapons. In July 2015, a large group of scientists and researchers, including Stephen Hawking, Elon Musk, and Steve Wozniak, issued an open letter calling for a ban on the development of autonomous offensive weapons beyond meaningful human control.53 “The nightmare headline for me is, ‘100,000 Refrigerators Attack Bank of America,’” said Vint Cerf, widely regarded as the father of the Internet. “That is going to take some serious thinking not only about basic security and privacy technology, but also how to configure and upgrade devices at scale,” he added, noting that no one wants to spend their entire weekend typing IP addresses for each and every household device.54 We do not recommend broad regulation of DAEs and the IoT or regulatory approvals.
“Issuing fungible assets like equities, bonds, and currencies on the blockchain and building the necessary infrastructure to scale it and make it commercial don’t require a banker’s CV,” he said. For one, “You don’t require all the legacy infrastructure or institutions that make up Wall Street today. . . . Not only can you issue these assets on the blockchain, but you can create systems where I can have an instantaneous atomic transaction where I might have Apple stock in my wallet and I want to buy something from you. But you want dollars. With this platform I can enter a single atomic transaction (i.e., all or none) and use my Apple stock to send you dollars.”48 Is it really that easy? The battle to reinvent the financial services industry differs from the battle for e-commerce in the early days of the Web. For businesses like Allaire’s to scale, they must facilitate one of the largest value transfers in human history, moving trillions of dollars from millions of traditional bank accounts to millions of Circle wallets.
The Rise of the Network Society by Manuel Castells
"Robert Solow", Apple II, Asian financial crisis, barriers to entry, Big bang: deregulation of the City of London, Bob Noyce, borderless world, British Empire, business cycle, capital controls, complexity theory, computer age, computerized trading, creative destruction, Credit Default Swap, declining real wages, deindustrialization, delayed gratification, dematerialisation, deskilling, disintermediation, double helix, Douglas Engelbart, Douglas Engelbart, edge city, experimental subject, financial deregulation, financial independence, floating exchange rates, future of work, global village, Gunnar Myrdal, Hacker Ethic, hiring and firing, Howard Rheingold, illegal immigration, income inequality, Induced demand, industrial robot, informal economy, information retrieval, intermodal, invention of the steam engine, invention of the telephone, inventory management, James Watt: steam engine, job automation, job-hopping, John Markoff, knowledge economy, knowledge worker, labor-force participation, laissez-faire capitalism, Leonard Kleinrock, longitudinal study, low skilled workers, manufacturing employment, Marc Andreessen, Marshall McLuhan, means of production, megacity, Menlo Park, moral panic, new economy, New Urbanism, offshore financial centre, oil shock, open economy, packet switching, Pearl River Delta, peer-to-peer, planetary scale, popular capitalism, popular electronics, post-industrial society, postindustrial economy, prediction markets, Productivity paradox, profit maximization, purchasing power parity, RAND corporation, Robert Gordon, Robert Metcalfe, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, social software, South China Sea, South of Market, San Francisco, special economic zone, spinning jenny, statistical model, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, the medium is the message, the new new thing, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, total factor productivity, trade 
liberalization, transaction costs, urban renewal, urban sprawl, zero-sum game
In 1975, Ed Roberts, an engineer who had created a small calculator company, MITS, in Albuquerque, New Mexico, built a computing box with the improbable name of Altair, after a character in the Star Trek TV series, that was the object of admiration of the inventor’s young daughter. The machine was a primitive object, but it was built as a small-scale computer around a microprocessor. It was the basis for the design of Apple I, then of Apple II, the first commercially successful micro-computer, realized in the garage of their parents’ home by two young school drop-outs, Steve Wozniak and Steve Jobs, in Menlo Park, Silicon Valley, in a truly extraordinary saga that has by now become the founding legend of the Information Age. Launched in 1976, with three partners and $91,000 capital, Apple Computers had by 1982 reached $583 million in sales, ushering in the age of diffusion of computer power. IBM reacted quickly: in 1981 it introduced its own version of the microcomputer, with a brilliant name: the Personal Computer (PC), which became in fact the generic name for microcomputers.
They gathered in loose groups, to exchange ideas and information on the latest developments. One such gathering was the Home Brew Computer Club, whose young visionaries (including Bill Gates, Steve Jobs, and Steve Wozniak) would go on to create in the following years up to 22 companies, including Microsoft, Apple, Cromemco, and North Star. It was the club’s reading, in Popular Electronics, of an article reporting Ed Roberts’s Altair machine which inspired Wozniak to design a microcomputer, Apple I, in his Menlo Park garage in the summer of 1976. Steve Jobs saw the potential, and together they founded Apple, with a $91,000 loan from an Intel executive, Mike Markkula, who came in as a partner. At about the same time Bill Gates founded Microsoft to provide the operating system for microcomputers, although he located his company in 1978 in Seattle to take advantage of the social contacts of his family.
See, for instance, Kranzberg’s (1992) acceptance speech of the award of honorary membership in NASTS.
5 Bijker et al. (1987).
6 There is still to be written a fascinating social history of the values and personal views of some of the key innovators of the 1970s’ Silicon Valley revolution in computer technologies. But a few indications seem to point to the fact that they were intentionally trying to undo the centralizing technologies of the corporate world, both out of conviction and as their market niche. As evidence, I recall the famous Apple Computer 1984 advertising spot to launch Macintosh, in explicit opposition to Big Brother IBM of Orwellian mythology. As for the countercultural character of many of these innovators, I shall also refer to the life story of the genius developer of the personal computer, Steve Wozniak: after quitting Apple, bored by its transformation into another multinational corporation, he spent a fortune for a few years subsidizing rock groups that he liked, before creating another company to develop technologies of his taste. At one point, after having created the personal computer, Wozniak realized that he had no formal education in computer sciences, so he enrolled at UC Berkeley.
Dealers of Lightning by Michael A. Hiltzik
Apple II, Apple's 1984 Super Bowl advert, beat the dealer, Bill Duvall, Bill Gates: Altair 8800, business cycle, computer age, creative destruction, Douglas Engelbart, Dynabook, Edward Thorp, El Camino Real, index card, Jeff Rulifson, John Markoff, Joseph Schumpeter, Marshall McLuhan, Menlo Park, oil shock, popular electronics, Robert Metcalfe, Ronald Reagan, Silicon Valley, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, the medium is the message, Vannevar Bush, Whole Earth Catalog, zero-sum game
A company, say, like Apple. The idea was not wholly implausible. Apple was coming on strong. Started in the proverbial Silicon Valley garage by Jobs and his high school classmate Steve Wozniak, Apple had successfully negotiated the transition in its product line from kit versions of Woz’s little personal computer to a more versatile version, the Apple II. This machine was unique in the hobbyist market. It came already assembled, with a keyboard (although it required a separate monitor). Shortly after Jobs’s appearance before Zarem’s group, Apple started bundling it with VisiCalc, a novel software program known as a financial spreadsheet—a “killer app” that would single-handedly turn the Apple II into a popular businessman’s tool. With fewer than forty employees in 1978, Apple was already one of the most sought-after investments among the small community of speculative private investors known as venture capitalists.
January: The Altair 8800, a hobbyist’s personal computer sold as a mail-order kit, is featured on the cover of Popular Electronics, enthralling a generation of youthful technology buffs—among them, Bill Gates—with the possibilities of personal computing. February: PARC engineers demonstrate for their colleagues a graphical user interface for a personal computer, including icons and the first use of pop-up menus, that will develop into the Windows and Macintosh interfaces of today. March 1: PARC’s permanent headquarters at 3333 Coyote Hill Road are formally opened. January 3: Apple Computer is incorporated by Steve Jobs and Steve Wozniak. August: Having perfected a new technology for designing high-density computer chips at PARC, Lynn Conway and Carver Mead begin drafting Introduction to VLSI Systems, a textbook on the technology that is written and typeset entirely on desktop publishing systems invented at the center. August 18: Xerox shelves a plan to market the Alto as a commercial project, closing the door to any possibility that the company will be in the vanguard of personal computing.
Those were the qualities that enabled him to hold the experienced investors of XDC rapt by relating the story of how he had founded Apple. Those, and the fact that at the age of twenty-four he was the chairman of a company already worth $70 million. A small handful of PARC engineers, like Larry Tesler, had not allowed their preconceptions about Apple’s customers or Jobs’s personality to cloud their perception of where these little computers might lead. Rather than shun the growing underground of youthful hackers, Tesler dove in. For a year or two he had been attending such cultural events as meetings of the Homebrew Computer Club, where young Altair and Commodore users met to trade their tiny software programs and swap lore. He was no stranger to Apple, having gone out with a woman who worked for the company. “I’d been to an Apple picnic as her date in 1978, when there were thirty employees,” Tesler recalled.
Start It Up: Why Running Your Own Business Is Easier Than You Think by Luke Johnson
Albert Einstein, barriers to entry, Bernie Madoff, business cycle, collapse of Lehman Brothers, corporate governance, corporate social responsibility, creative destruction, credit crunch, Grace Hopper, happiness index / gross national happiness, high net worth, James Dyson, Jarndyce and Jarndyce, Jarndyce and Jarndyce, Kickstarter, mass immigration, mittelstand, Network effects, North Sea oil, Northern Rock, patent troll, plutocrats, Plutocrats, Ponzi scheme, profit motive, Ralph Waldo Emerson, Silicon Valley, software patent, stealth mode startup, Steve Jobs, Steve Wozniak, The Wealth of Nations by Adam Smith, traveling salesman, tulip mania, Vilfredo Pareto, wealth creators
They usually get married, have a family and develop a new set of priorities in life. Spouses and children become far more important to a founder than their business and the partnership that created it. They make some money, the hunger and ambition abate, and perhaps they decide to give up all the striving for a more settled life. Illness can also intervene. At both Microsoft and Apple, only a single founder remains involved and famous: Bill Gates and Steve Jobs. But in each case there was a co-founder who dropped out through ill-health (Paul Allen and Steve Wozniak respectively). The fact is that running large organizations takes real stamina and many find the intensity and responsibilities too onerous. Failure tends to bring out the knives. Everyone starts blaming someone else for the problems. It amazes me how often chief executives get away with the argument that they were not money men, and that the finance director was the only person who understood what went wrong and why the cash ran out.
We might have been sacked from a job, an account, a project, or by a client. But life continues, new opportunities arise. ‘That which doesn’t kill you makes you stronger’ Friedrich Nietzsche I have been given the boot on more than one occasion, but the experience has only encouraged me to try harder. Steve Jobs said: ‘Getting fired from Apple was the best thing that could have happened to me.’ He went off and founded NeXT, bought Pixar, and then returned to Apple and made it vastly more successful than it had ever been. For him, losing his role at the company he founded was a stimulus to make a new start. I can empathize. As a stockbroking analyst in the 1980s I was passed over for promotion by my then boss (who went on to become a chairman of insurance giant Prudential). I don’t really blame him – I was never cut out to be a bank employee.
If you are ambitious, the part-time option should only ever be temporary. To avoid being forever in stealth mode, you should have a ‘boat-burning’ target: a clearly defined point at which you chuck the day job and dive in. Don’t tweak your fledgling business until it seems like a sure-fire bet: it never will be. It’s very easy to tinker away on the margins for ever but, as Steve Jobs once said to a perfectionist engineer at Apple, ‘real artists ship’. The online revolution has made moonlighting easier than ever. As long as there is no conflict with your principal job, why shouldn’t such an arrangement be successful? Of course, a part-time enterprise will have to become your hobby and consume your holidays. You must steel yourself for the prospect of 100-hour working weeks. It will test your determination: if you don’t have the energy and commitment to put that effort in, don’t expect the journey to get easier if you go full-time.
Information Doesn't Want to Be Free: Laws for the Internet Age by Cory Doctorow, Amanda Palmer, Neil Gaiman
Airbnb, barriers to entry, Brewster Kahle, cloud computing, Dean Kamen, Edward Snowden, game design, Internet Archive, John von Neumann, Kickstarter, MITM: man-in-the-middle, optical character recognition, plutocrats, Plutocrats, pre–internet, profit maximization, recommendation engine, rent-seeking, Saturday Night Live, Skype, Steve Jobs, Steve Wozniak, Stewart Brand, transfer pricing, Whole Earth Catalog, winner-take-all economy
It’s up to creators everywhere to engage with their colleagues about the ways that expanded liability for intermediaries drives us all into the old-media companies’ corrals, where they get to make the rules, pick the winners, and run the show. 3. DOCTOROW’S THIRD LAW Information Doesn’t Want to Be Free, People Do BACK IN 1984, Stewart Brand—founder of the Whole Earth Catalog—had a public conversation with Apple cofounder Steve Wozniak at the first Hackers Conference. There, Brand uttered a few dozen famous words: “On the one hand, information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other.”
Blu-ray’s keys are 128 bits long—you could spraypaint one of them onto a smallish wall. Is Apple for or against digital locks? One of the world’s most successful digital-lock vendors is Apple. Despite public pronouncements from its late cofounder, Steve Jobs, condemning DRM, Apple has deployed digital locks in nearly every corner of its business. The popular iOS devices—the iPod, iPhone, and iPad—all use DRM that ensures that only software bought through Apple’s store can run on them. (Apple gets 30 percent of the purchase price of such software, and another 30 percent of any in-app purchases you make afterward.) Apple’s iTunes Store, meanwhile, sells all its digital video and audiobooks with DRM. Many people assume that this is at publishers’ insistence, but it’s not so: when Random House Audio published the audiobook of my novel Little Brother, Apple refused to carry it without DRM. 1.4 Digital Locks Always Break DIGITAL-LOCK VENDORS TEND to focus on how hard their technology is to beat if you attack it where it’s strongest.
The labels came to realize that they’d been caught in yet another roach motel: their customers had bought millions of dollars’ worth of Apple-locked music, and if the labels left the iTunes Store, the listeners would be hard-pressed to follow them. Just to make this very clear, Apple threatened a competitor, RealNetworks, when Real released a version of its player that allowed users to load (digitally locked) songs bought from the RealPlayer store onto an iPod, enabling customers to play both Real’s and Apple’s music on the same device. “We are stunned that RealNetworks has adopted the tactics and ethics of a hacker to break into the iPod, and we are investigating the implications of their actions under the DMCA and other laws,” Apple said. But Amazon offered the labels a lateral move: give up on digital rights management (DRM) software and sell your music as “unprotected” MP3s (which also play on iPods), and you can start to wean your customers off the iTunes Store—or at least weaken its whip-hand over your business.
Track Changes by Matthew G. Kirschenbaum
active measures, Apple II, Apple's 1984 Super Bowl advert, Bill Gates: Altair 8800, Buckminster Fuller, commoditize, computer age, corporate governance, David Brooks, dematerialisation, Donald Knuth, Douglas Hofstadter, Dynabook, East Village, en.wikipedia.org, feminist movement, forensic accounting, future of work, Google Earth, Gödel, Escher, Bach, Haight Ashbury, HyperCard, Jason Scott: textfiles.com, Joan Didion, John Markoff, John von Neumann, Kickstarter, low earth orbit, mail merge, Marshall McLuhan, Mother of all demos, New Journalism, Norman Mailer, pattern recognition, pink-collar, popular electronics, RAND corporation, rolodex, Ronald Reagan, self-driving car, Shoshana Zuboff, Silicon Valley, social web, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Ted Nelson, text mining, thinkpad, Turing complete, Vannevar Bush, Whole Earth Catalog, Y2K, Year of Magical Thinking
Nonetheless, at the very height of this period she found time to help start a Kaypro users’ group called Bad Sector, the same name she had unceremoniously given to her first computer. Users’ groups were one of the fixtures of early computer culture. The first and most famous of them all was the Homebrew Computer Club, which met in the auditorium of Stanford University’s linear accelerator. Adam Osborne was a member, and at one of those now quasi-legendary gatherings Steve Wozniak had demonstrated his prototype for what eventually became the Apple II computer. But users’ groups were all very “homebrew”; they tended to coalesce organically, their members finding each other through notices tacked up in computer shops (or on virtual bulletin boards), ads in newsletters, and word of mouth. Typically they were tied together by an interest in a common system or product. Members would swap tips and help one another troubleshoot.
(Lexitron’s was the first in 1971; Vydec’s in 1974 was the first to display a full page of text on the screen.) Beeching’s fanciful scenario would thus have already been familiar—mundane—to any office secretary who had been trained on Lexitron or its competitors’ equipment.2 The creation of a working “TV Typewriter” (TVT) also soon became a rite of passage for the home computer hobbyist. It was a key stepping stone for Steve Wozniak on the way to the Apple computer, and it featured prominently in the announcement for the first meeting of the Homebrew Computer Club: “Are you building your own computer? Terminal? TV Typewriter? I/O [input/output] device? Or some other digital black-magic box?” read the ad that was posted around Silicon Valley in February 1975.3 Computers themselves, of course, compute: which is to say they work by fundamentally arithmetical principles.
Writing Track Changes (mostly in Word, on a couple of small, lightweight laptops) has taken me back to a time in my youth that I recognize only in retrospect as the pivotal moment in the growth and widespread adoption of word processing for literature. Many of the writers I stayed up late reading as a teenager—best-selling authors like Stephen King, Frank Herbert, Anne McCaffrey, and Tom Clancy—were themselves experimenting with the technology, getting their own first computers at more or less the same time we got our Apple. (Clancy got an Apple IIe himself, in fact.) All of this explains one of the attractions of this project to me: a chance to explore some of the connections that tied me, a kid in my upstairs bedroom poking around with the Bank Street Writer or Apple BASIC, to the writers I idolized—who were also, it turns out, wrestling with the very same technology themselves. They were learning the same jargon and terminology that I was, scratching their heads over dot matrix and ink jet, and struggling to initialize a diskette or recover a file they had just inadvertently deleted.
The End of Jobs: Money, Meaning and Freedom Without the 9-To-5 by Taylor Pearson
"side hustle", Airbnb, barriers to entry, Ben Horowitz, Black Swan, call centre, cloud computing, commoditize, creative destruction, David Heinemeier Hansson, Elon Musk, en.wikipedia.org, Frederick Winslow Taylor, future of work, Google Hangouts, Kevin Kelly, Kickstarter, knowledge economy, knowledge worker, loss aversion, low skilled workers, Lyft, Marc Andreessen, Mark Zuckerberg, market fragmentation, means of production, Oculus Rift, passive income, passive investing, Peter Thiel, remote working, Ronald Reagan: Tear down this wall, sharing economy, side project, Silicon Valley, Skype, software as a service, software is eating the world, Startup school, Steve Jobs, Steve Wozniak, Stewart Brand, telemarketer, Thomas Malthus, Uber and Lyft, uber lyft, unpaid internship, Watson beat the top human players on Jeopardy!, web application, Whole Earth Catalog
Instead of a large, up-front investment in hiring and training someone who may or may not be good enough for the role, you’re able to make a small investment, over time, in someone who has been vetted by other people in your industry. Self-Education: Information Wants to Be Free In 1984, at the first Hackers Conference, Whole Earth Catalog founder Stewart Brand was overheard telling Apple co-founder Steve Wozniak the now iconic phrase: “Information wants to be free.” The internet has done more to facilitate information transparency than any technology since the printing press. Knowledge that used to be opaque and hard to source is often now just a Google search away. Scott Young, a young entrepreneur who now teaches others about advanced learning strategies, worked his way through the entire MIT course material in twelve months for two thousand dollars.
He is engaged in a dialogue with his reality, asking “why” and “why not” instead of “how” or “what.” The degree to which we’re able to design our reality is directly related to our quality of life, freedom, and wealth. Those who design reality have a higher quality of all factors in their life, and through designing their reality they enable others to do the same by creating more wealth. In designing my reality in the form of an iPhone, Steve Jobs and Apple created more power, freedom, and wealth for me than Rockefeller had. This leads to an upward spiral. As wealth is increasing, so is our ability to design our realities. PhD or Podcast? While it’s always been true that great work comes from those who freely choose it and that the ability to design our realities creates more freedom, it’s the changes we’ve seen over the past decade that made both of those radically more accessible and safer.
Big Mistakes: The Best Investors and Their Worst Investments by Michael Batnick
activist fund / activist shareholder / activist investor, Airbnb, Albert Einstein, asset allocation, bitcoin, Bretton Woods, buy and hold, buy low sell high, cognitive bias, cognitive dissonance, Credit Default Swap, cryptocurrency, Daniel Kahneman / Amos Tversky, endowment effect, financial innovation, fixed income, hindsight bias, index fund, invention of the wheel, Isaac Newton, John Meriwether, Kickstarter, Long Term Capital Management, loss aversion, mega-rich, merger arbitrage, Myron Scholes, Paul Samuelson, quantitative easing, Renaissance Technologies, Richard Thaler, Robert Shiller, Robert Shiller, Snapchat, Stephen Hawking, Steve Jobs, Steve Wozniak, stocks for the long run, transcontinental railway, value at risk, Vanguard fund, Y Combinator
“I'm going to hold onto this fund that's done horribly because I can't stand the thought of selling at the bottom,” and it can compel us to do something because we don't want to regret not doing it: “I'm going to buy this ICO (initial coin offering) because I won't be able to live with myself if I miss the next Bitcoin.” You know Steve Jobs and his early partner Steve Wozniak, but the name Ronald Wayne likely means nothing to you. Wayne was the third founder of Apple, but the reason his name is erased from the history books is that in 1976 he sold his 10% stake in the company for $800. Apple is currently worth north of $900 billion! You're never going to experience anything quite this painful, but the odds are high that at some point in time, you'll pass on an investment that goes on to deliver fantastic results. You cannot avoid regrets in this game. You'll buy stuff you wish you hadn't and sell things you wish you held onto.
If you drop an eight‐sided ball, there's no way to predict which way it would bounce. The same idea holds true in finance – serotonin plus adrenaline plus different time horizons times a few million participants equals literally nobody knows. Let's pretend that we knew with complete certainty that Apple's earnings will grow by 8% a year for the next decade. Would this give you the confidence to buy its stock? It shouldn't, and here's why. How fast is the overall market growing, and how fast are investors expecting Apple to grow? Even if we had clairvoyance on the most important driver of long‐term returns, earnings, it wouldn't be enough to ensure success. The missing ingredient, which cannot be modeled by all the PhDs in the world, is investors' moods and expectations. Investing with perfect information is difficult – investing with imperfect information and cognitive biases has made mincemeat out of millions of investors.
Buffett did not become one of the richest men in the world by spreading his bets across his top 100 ideas. Berkshire is in that rare group of stocks that is responsible for the majority of the market's long‐term gains. The distribution of total stock market returns is heavily skewed toward these giant winners. The top 1,000 stocks alone, or less than 4% of the total public companies since 1926, have accounted for all of the market's gains. Exxon Mobil, Apple, Microsoft, General Electric, and IBM have each generated over half a trillion dollars in shareholder wealth.2 The hunt for these potentially life‐changing stocks motivates millions of market participants each day. But for every Berkshire Hathaway there is a Sears Holdings, and for every IBM a GoPro. While “concentrate to get rich” is certainly true, it's not wise financial advice. The stocks that produce these gigantic returns always appear obvious in hindsight, but in real time, finding and holding them is harder than hitting a 100 mph fastball.
Overcomplicated: Technology at the Limits of Comprehension by Samuel Arbesman
algorithmic trading, Anton Chekhov, Apple II, Benoit Mandelbrot, citation needed, combinatorial explosion, Danny Hillis, David Brooks, digital map, discovery of the americas, en.wikipedia.org, Erik Brynjolfsson, Flash crash, friendly AI, game design, Google X / Alphabet X, Googley, HyperCard, Inbox Zero, Isaac Newton, iterative process, Kevin Kelly, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, mandelbrot fractal, Minecraft, Netflix Prize, Nicholas Carr, Parkinson's law, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman: Challenger O-ring, Second Machine Age, self-driving car, software studies, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, superintelligent machines, Therac-25, Tyler Cowen: Great Stagnation, urban planning, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Y2K
A self-taught genius who worked during the early part of the twentieth century, Ramanujan was not your average mathematician who tried to solve problems through trial and error and occasional flashes of brilliance. Instead, equations seemed to leap fully formed from his brain, often mind-bogglingly complex and stunningly correct (though some were also wrong). The Ramanujan of technology might be Steve Wozniak. Wozniak programmed the first Apple computer and was responsible for every aspect of the Apple II. As the programmer and novelist Vikram Chandra notes, “Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found. . . . Woz did both hardware and software. Woz created a programming language in machine code. Woz is hardcore.” Wozniak was on a level of technological understanding that few can reach. We can even see the extremes of our brain’s capacity—as well as how its limits can be stretched—in the way London cabdrivers acquire and use what is known as The Knowledge.
This phenomenon of “algorithm aversion” hints at a sentiment many of us share, which appears to be a lower-intensity version of technological fear. On the other hand, some of us veer to the opposite extreme: an undue veneration of our technology. When something is so complicated that its behavior feels magical, we end up resorting to the terminology and solemnity of religion. When we delight at Google’s brain and its anticipation of our needs and queries, when we delicately caress the newest Apple gadget, or when we visit a massive data center and it stirs something in the heart similar to stepping into a cathedral, we are tending toward this reverence. However, neither of these responses—whether from experts or laypeople—is good or productive. One leaves us with a crippling fear and the other with a worshipful awe of systems that are far from meriting unquestioning wonder. Both prevent us from confronting our technological systems as they actually are.
The Driver in the Driverless Car: How Our Technology Choices Will Create the Future by Vivek Wadhwa, Alex Salkever
23andMe, 3D printing, Airbnb, artificial general intelligence, augmented reality, autonomous vehicles, barriers to entry, Bernie Sanders, bitcoin, blockchain, clean water, correlation does not imply causation, distributed ledger, Donald Trump, double helix, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, Google bus, Hyperloop, income inequality, Internet of things, job automation, Kevin Kelly, Khan Academy, Kickstarter, Law of Accelerating Returns, license plate recognition, life extension, longitudinal study, Lyft, M-Pesa, Menlo Park, microbiome, mobile money, new economy, personalized medicine, phenotype, precision agriculture, RAND corporation, Ray Kurzweil, recommendation engine, Ronald Reagan, Second Machine Age, self-driving car, Silicon Valley, Skype, smart grid, stem cell, Stephen Hawking, Steve Wozniak, Stuxnet, supercomputer in your pocket, Tesla Model S, The Future of Employment, Thomas Davenport, Travis Kalanick, Turing test, Uber and Lyft, Uber for X, uber lyft, uranium enrichment, Watson beat the top human players on Jeopardy!, zero day
Japan may favor robots to protect its elderly and preserve its economy, but a far more contentious discussion, one with tremendous implications for humanity, is under way right now concerning the use of robots for destructive purposes. The debate concerns whether we should allow robots powered by A.I. to kill people autonomously. More than 20,000 people signed an open letter in July 2015 that called for a worldwide ban on autonomous killing machines. A thousand of these signatories were A.I. researchers and technologists, including Elon Musk, Stephen Hawking, and Steve Wozniak.6 Their logic was simple: once the development of military robots enabled to kill humans autonomously begins, the technology will follow the usual cost and capability curves, and in the not-so-distant future A.I. killing machines will therefore become commodity items, easy to purchase and available to every dictator, paramilitary group, and terrorist cell. Also, of course, despotic (or even wayward democratic) governments could use these machines to control and cow their populations.
But with home health monitors, we could become as obsessed with monitoring our body’s vitals as we are with Fitbits and calorie trackers. We could become overly confident or pessimistic on the basis of data we don’t really understand but imagine we do. Overall, as you will note, I am really excited about the advances in medicine. Yes, Apple and Google, both developing medical devices and healthcare applications, may want my health data in order to present me with more highly targeted ads. But their motivation is to keep me healthy, to prevent disease, so that I can do more searches and download more applications. The motivation of the healthcare industry has been to keep me coming back for more. So, Apple, take my data and send me your ads, but please help me keep healthy.
PART THREE What Are the Risks and the Rewards?
8 Robotics and Biology: The Inevitable Merging of Man and Machine
As a child, I believed that by the time I grew up, we would all have robots like Rosie, from The Jetsons, cleaning up after us.
Office of Personnel Management (undated), https://www.opm.gov/cybersecurity/cybersecurity-incidents (accessed 21 October 2016).
4. Casey Newton, “The mind-bending messiness of the Ashley Madison data dump,” the Verge 19 August 2015, http://www.theverge.com/2015/8/19/9178855/ashley-madison-data-breach-implications (accessed 21 October 2016).
5. Mat Honan, “How Apple and Amazon security flaws led to my epic hacking,” WIRED 6 August 2012, https://www.wired.com/2012/08/apple-amazon-mat-honan-hacking (accessed 21 October 2016).
6. Kevin Kelly, The Inevitable, Viking: New York, 2016.
CHAPTER TEN
1. Jonathan Vanian, “7-Eleven Just Used a Drone to Deliver a Chicken Sandwich and Slurpees,” Fortune 22 July 2016, http://fortune.com/2016/07/22/7-eleven-drone-flirtey-slurpee (accessed 21 October 2016).
2. Mary Meeker, “Internet Trends 2015—Code Conference,” Kleiner Perkins Caufield & Byers, http://www.kpcb.com/blog/2015-internet-trends.
3.
Barefoot Into Cyberspace: Adventures in Search of Techno-Utopia by Becky Hogge, Damien Morris, Christopher Scally
A Declaration of the Independence of Cyberspace, back-to-the-land, Berlin Wall, Buckminster Fuller, Chelsea Manning, citizen journalism, cloud computing, corporate social responsibility, disintermediation, Douglas Engelbart, Electric Kool-Aid Acid Test, Fall of the Berlin Wall, game design, Hacker Ethic, informal economy, information asymmetry, Jacob Appelbaum, jimmy wales, John Markoff, Julian Assange, Kevin Kelly, mass immigration, Menlo Park, Mitch Kapor, MITM: man-in-the-middle, moral panic, Mother of all demos, Naomi Klein, Nelson Mandela, Network effects, New Journalism, Norbert Wiener, peer-to-peer, Richard Stallman, Silicon Valley, Skype, Socratic dialogue, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Telecommunications Act of 1996, The Hackers Conference, Vannevar Bush, Whole Earth Catalog, Whole Earth Review, WikiLeaks
Not only was he a veteran of the early non-violent protest movement; now he was also into the promise of personal computers in a big way. Most of the money Moore received the night of Stewart Brand’s Demise Party eventually went into founding the Homebrew Computer Club, a place for amateur and professional computer enthusiasts to tinker with personal computers. And so from the ashes of the Catalog rose a legendary phoenix. For it was at Homebrew that Steve Wozniak would meet Steve Jobs, and the two would decide to found Apple Computer. Lee Felsenstein, designer of the first mass-produced portable computer, the Osborne 1, also hung out there, as did the legendary telephone network hacker or “phreak” John “Captain Crunch” Draper. And it was to the Homebrew Computer Club that a 20-year-old Bill Gates wrote his disgruntled “Open Letter to Hobbyists”.
* * *
After the Catalog’s demise party, Brand reached out to a wider audience.
I’m here to interview Rop Gonggrijp, a Dutch hacker and activist, and a long-time friend of the CCC. And yes, I may have dropped his name at the desk to secure our entry; after all, at one time he was the Netherlands’ most notorious hacker. But I’m sure he wouldn’t mind. He’s accustomed to the so-called “social hack”. In 2006, Rop and his friends obtained decommissioned voting computers from a Dutch local authority in exchange for apple cake. He then proceeded to demonstrate on national television how the machines could be hacked to reveal and even alter what unsuspecting Dutch voters were keying in at the polling station. Unsurprisingly, the Dutch don’t use electronic voting anymore in their elections; they vote with pencil and paper. This is how I know Rop. When I was running the Open Rights Group we also campaigned to eliminate electronic voting from UK elections.
Housing was super cheap, welfare and tenancy law was on your side, and the city was – and still is – full of pirate nightclubs, galleries and art spaces. This couple both study history of art: he’s about to get his PhD; hers has won her a permanent fellowship at Berlin’s Humboldt University. He wears pointy shoes; she wears vintage dresses bought by weight in downtown East Berlin. They have a Siamese cat called Raoul. Neither of them cares about computers, although they both have Apple Mac laptops for work. Their names are Sarah and Luke. The one problem with Berlin, they tell me, is the Germans. They’re all so bloody rude. To prove their point, Sarah and Luke have brought me to what seems like the most poorly-served bar in Berlin. When the waitress does eventually arrive at our table, it seems she has only done so in order to let us know that tonight she shouldn’t really be working at all – the boss has brought her in because he got the rota wrong.
Age of Context: Mobile, Sensors, Data and the Future of Privacy by Robert Scoble, Shel Israel
Albert Einstein, Apple II, augmented reality, call centre, Chelsea Manning, cloud computing, connected car, Edward Snowden, Edward Thorp, Elon Musk, factory automation, Filter Bubble, G4S, Google Earth, Google Glasses, Internet of things, job automation, John Markoff, Kickstarter, lifelogging, Marc Andreessen, Mars Rover, Menlo Park, Metcalfe’s law, New Urbanism, PageRank, pattern recognition, RFID, ride hailing / ride sharing, Robert Metcalfe, Saturday Night Live, self-driving car, sensor fusion, Silicon Valley, Skype, smart grid, social graph, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Tesla Model S, Tim Cook: Apple, ubercab, urban planning, Zipcar
Some years will pass before people look back and try to understand how they ever could have lived without such a device. Scoble tells audiences it’s like seeing the first Apple IIs as they rolled off the assembly line in 1977: They were like nothing people had seen before, but you couldn’t do much with them. Decision makers at HP and Atari weren’t interested in cutting a deal with Steve Wozniak and Steve Jobs for rights to market their new computer—the new, highly personalized devices were obviously too radically different to sell in significant quantity. Yet, it turned out a lot of people wanted them and the Apple II kicked off a 20-year explosion of invention and productivity that we now remember as the PC revolution. Google Glass will do the same. How long will it take? We’re not sure.
Some pointed to a bitter and public divorce between Apple and Google. Steve Jobs had considered Google Android to be a direct rip-off of Apple’s iOS operating system. Could Apple Maps have simply been a crudely devised and poorly executed act of revenge against a powerful former ally? We think not. In our view, Apple made a huge mistake, but it was strategically motivated and not part of a petty Silicon Valley vendetta. Although Google and Apple historically had lots of good reasons to be allies, they were destined to become the rivals they now are. In the past, tech companies were pretty much divided between hardware and software, so an alliance between world leaders in each of the two categories was formidable, to say the least. Apple was clearly the pacesetter in world-changing mobile hardware.
To remain a leader, Apple and Google each needed to vie for online time and for alliances with third-party developers, and to provide platforms that make those developers’ apps valuable. For Google that meant having its own operating system; for Apple it meant having maps because it saw the unquestionable value of location-based services. For Apple, and many companies, mobile apps are the secret sauce of the Age of Context; mobile mapping is the most strategic of all categories. Caterina Fake, CEO and founder of Findery, a location-based platform, explains it best in a statement that is simultaneously obvious and profound: “Without location, there is no context.” And for Apple, without context there will be no leadership. So Apple and Google divorced. Today Android and iOS compete for mobile operating system dominance, and thus Apple had little choice but to develop its own maps. Its big mistake was not in the play, but in being unprepared for the enormous challenges it faced on an unrealistically short timeline and then blindly plowing forward.
Little Bets: How Breakthrough Ideas Emerge From Small Discoveries by Peter Sims
Amazon Web Services, Black Swan, Clayton Christensen, complexity theory, David Heinemeier Hansson, deliberate practice, discovery of penicillin, endowment effect, fear of failure, Frank Gehry, Guggenheim Bilbao, Jeff Bezos, knowledge economy, lateral thinking, Lean Startup, longitudinal study, loss aversion, meta-analysis, PageRank, Richard Florida, Richard Thaler, Ruby on Rails, Silicon Valley, statistical model, Steve Ballmer, Steve Jobs, Steve Wozniak, theory of mind, Toyota Production System, urban planning, Wall-E
It began when Jobs dropped out of Reed College during his first year, but stuck around campus and decided to take a class in calligraphy. “It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating,” Jobs recalled in a commencement speech at Stanford University. Jobs never expected the experience to have practical applications, but it did ten years later when Jobs and Steve Wozniak were developing Apple’s first Macintosh computer. “It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would never have had multiple typefaces or proportionally spaced fonts.” Some investors have achieved significant advantages by embracing the value of immersion. Most investors sit all day in offices in London, New York, or Boston. In contrast, James Chanos, who runs Kynikos Associates and famously predicted Enron’s and Tyco’s fall, regularly sends analysts to attend industry trade shows to speak with sales reps working on the front lines to get an up-to-date pulse on market activities.
The surprising story of Pixar’s development from a struggling startup without a viable business plan into one of the most successful movie makers ever beautifully illustrates both the value of building means and the value of an affordable losses mentality. Pixar was a computer hardware company when Steve Jobs bought it in 1986. Before purchasing Pixar, Jobs had been forced out of Apple in 1985 by his hand-picked CEO successor, John Sculley, following frequent clashes. Sculley wanted Jobs to focus exclusively on products, while Jobs wanted to take Apple back over from Sculley. After Sculley caught wind of an attempted coup by Jobs when Sculley was on a trip to Asia, he stripped Jobs of his responsibilities. Jobs then left Apple, bought Pixar, and started another computer company, called Next Computer. Both Pixar and Next struggled, and the open question was whether Steve Jobs was just another one-hit wonder. Jobs had had a clear initial vision for Next: to provide computer workstations to the education market.
This book is a superb introduction to her research as well as an aid to help change one’s own mind-set. Kahney, Leander. Inside Steve’s Brain. New York: Portfolio, 2009. Kahney, a contributing editor at Wired magazine, has done the best job I’ve seen of writing about Steve Jobs and Apple, a company that is notoriously secretive. Kahney’s book provides the reader with a fairly detailed feel for how Jobs works and leads, as well as the best description of Apple’s design process that I’ve seen. The book was bolstered by Kahney’s ability to interview Apple’s chief designer, Jonathan Ive, among other insiders. Lamott, Anne. Bird by Bird: Some Instructions on Writing and Life. Garden City, NY: Anchor, 1995. Anne Lamott is primarily a prominent American novelist, but this is a nonfiction book in which Lamott describes her writing tactics, especially how to overcome the common fears and barriers writers face.
The Fire Starter Sessions: A Soulful + Practical Guide to Creating Success on Your Own Terms by Danielle Laporte
affirmative action, Albert Einstein, David Heinemeier Hansson, delayed gratification, Frank Gehry, index card, invisible hand, Lao Tzu, pattern recognition, Ralph Waldo Emerson, Steve Jobs, Steve Wozniak
Spending other people’s money is addictive. It’s usually a bad deal. Customers move down the totem pole. Raising money is incredibly distracting. When raising money is the right thing to do, get mentors, start sending your banker season tickets, prepare to be out of the office at least 30 percent of the time. Take your vitamins.
All the best things I did at Apple came from (a) not having any money, and (b) not having done it before, ever.
—Steve Wozniak, cofounder of Apple
HOW MONEY FEELS
THERE’S ONLY ONE TIME THAT DOING IT FOR THE MONEY WORKS.…
And that’s when you have a light at the end of the tunnel and an unwavering commitment to yourself to transition into doing work that makes you happier, or selling something that you’re 100 percent proud of. At any given time you could be juggling a “soul job” and a “ho job.”
But what seems like avoidance may be a deeper inkling of wrong timing. Hesitation can be a form of wisdom. Motives become clearer; new information shows up. Amazing grace can happen when you choose inner rhythms over external pressure.
ENTHUSIASM SAVES LIVES
You know what’s easy? Dreaming. Hanging out with people who make you feel good. Laughing. Resting. Being passionate. Sometimes you have to shovel horse apples to make your dreams come true. But, ultimately, no dream will serve you if you’re forcing yourself to make it happen. Enthusiasm is a fantastic indicator of where your true strength lives. It’s the immediate “I love it!” response, the game you’ve got to get into, the cause you can’t walk away from, the idea that makes you pause and then nod, “Oh, this is a good one, a really good one.” Enthusiasm evokes a determined “no matter what-ness.”
Be daring enough to tell us—your customers, your fans, your people—about your ambitions because we’ll be the ones to help you fulfill them. Expect to offend some people. If you’re not having some polarizing effects, then you’re not fully showing up.
ONLY THE SELF-REFERENCING THRIVE
The starting point is Who am I? not What will sell? Your foundation has to be built on your real passion. The rest of branding is about accurate packaging.
CONSISTENCY WINS
Keep on being yourself, relentlessly. We can count on Apple to innovate. We know that Ford trucks will always be tough. We could rely on Dr. Martin Luther King, Jr., to deliver his vision with strength. Being consistent doesn’t mean you don’t change or evolve. Look at Madonna. She’s consistently reinventing herself. Reinvention is her brand. When you consistently and genuinely show up, you build trust.
PRIDE IS POSITIVE
Rejoice and love yourself today
’Cause baby, you were born this way.
Company: A Short History of a Revolutionary Idea by John Micklethwait, Adrian Wooldridge
affirmative action, barriers to entry, Bonfire of the Vanities, borderless world, business process, Charles Lindbergh, Corn Laws, corporate governance, corporate raider, corporate social responsibility, creative destruction, credit crunch, crony capitalism, double entry bookkeeping, Etonian, hiring and firing, industrial cluster, invisible hand, James Watt: steam engine, joint-stock company, joint-stock limited liability company, Joseph Schumpeter, knowledge economy, knowledge worker, laissez-faire capitalism, manufacturing employment, market bubble, mittelstand, new economy, North Sea oil, race to the bottom, railway mania, Ronald Coase, Silicon Valley, six sigma, South Sea Bubble, Steve Jobs, Steve Wozniak, strikebreaker, The Nature of the Firm, The Wealth of Nations by Adam Smith, Thorstein Veblen, trade route, transaction costs, tulip mania, wage slave, William Shockley: the traitorous eight
Packard served as deputy secretary of defense in the first Nixon administration. In the 1970s, the Valley began to acquire its identity. The name “Silicon Valley” was invented in 1971 by a local technology journalist—reflecting the success of its memory-chip makers. Meanwhile, the Valley began to be taken over by the sort of people who protested against the Vietnam War, rather than helped run it. In 1976, Steve Jobs and Steve Wozniak set up Apple Computer in the Jobs family garage. But the 1970s boom was brought to a halt by the Japanese. On “the black day,” March 28, 1980, Richard Anderson, an HP manager, revealed that tests had shown that the Japanese memory chips outperformed the Valley’s. To its shame, the Valley turned to the American government for protection, but it also successfully changed shape, outsourcing its manufacturing and diversifying from chips into computer software.
It built museums and art galleries in a country that was prone to philistinism. And it bound the classes together in a society where the income gap was widening. The third and most important thing that provided a bedrock of support for the company came down to a simple proposition: The company was making America richer. In his essay “Why Is There No Socialism in the United States?,” Werner Sombart, a German sociologist, argued that “on the reefs of roast beef and apple pie socialist utopias of every sort are sent to their doom.” The new companies plainly improved the living standards of millions of ordinary people, putting the luxuries of the rich within the reach of the man in the street. When Henry Ford went into the car business, it was devoted to handcrafting toys for the super-rich; by 1917, he had sold 1.5 million Model T’s. When George Eastman purchased his first camera in November 1877, it cost him $49.58, and was so difficult to use that he had to pay $5 for lessons.
(Nobody was particularly surprised when a survey showed that 82 percent of chief executives admitted to cheating at golf.)34 Meanwhile, investors fumed when they discovered that Wall Street analysts had been misleading them with Orwellian doublespeak: to the cognoscenti, a “buy” recommendation meant “hold” and “hold” meant “run like hell.” What had gone wrong? Two explanations emerged. The first, to which the Bush administration initially subscribed, might be described as the “bad apples” school: the scandals were the product of individual greed, not a flawed system. The bankruptcies and the arrests would be enough: the founder of Adelphia, John Rigas, was forced to do a “perp walk,” clamped into handcuffs and paraded in front of the cameras. By contrast, those of the “rotten roots” school argued that the problems went much deeper. They argued that the 1990s had seen a dramatic weakening of proper checks and balances.
How PowerPoint Makes You Stupid by Franck Frommer
Albert Einstein, business continuity plan, cuban missile crisis, dematerialisation, hypertext link, invention of writing, inventory management, invisible hand, Just-in-time delivery, knowledge worker, Marshall McLuhan, means of production, new economy, oil shock, Ronald Reagan, Silicon Valley, Steve Jobs, Steve Wozniak, union organizing
And some champion presenters have grasped this fact.
PERFORMANCE BUSINESS
Over the course of a few years, the late head of Apple became a past master in the realm of spectacle, to the point that entire books have been written about his talents as a presenter. Steve Jobs’s shows began on January 24, 1984, for the release of the first Macintosh. It is touching today to see the young man in a black suit with a bow tie, a little ironic smile at the corner of his mouth as though he’d just made a bad joke, presenting the machine that revolutionized the computer market. Obviously, presentations at the time did not yet use dedicated software. But the procedure was already well honed. Even though Jobs, just out of his garage with Steve Wozniak, read his notes, spoke too fast, and stood awkwardly, he made a few jokes, manipulated projected photographs, and played with numbers.
The presentation follows a well-tried plan: brief slide on earnings—better to be quick; they’re bad—then Steve Jobs, armed with his remote, the essential tool of Steve Notes, briefly recounts his time at Pixar, then follows with three quotations that appear on the screen. “Apple has become irrelevant.” “Apple can’t execute anything.”18 “Apple’s culture is anarchy; you can’t manage it.” Jobs’s entire presentation is based on these three negative judgments. The procedure is clever; it enables him to build, practically in real time, Apple’s new strategy on the basis of criticisms made of it and to bring out the value of the innovative products and services intended to contradict these received ideas. Jobs’s first trick is to rely on the audience’s taste for numbers. The use of a number as image or emblem is an indication of his way of using a mere item of management information as an element of communication: quantity thus becomes a sign of quality.19 The number of units sold, hard-drive capacity, prices, market share—all become icons of Apple’s success over the years, powerful elements of memory.
The computer is the star of the presentation. The lights go out, mysterious music is heard—Tubular Bells by Mike Oldfield15—and all the machine’s features start to parade on the screen, crowning the show. This type of exhibition symbolizes one of Apple’s trademarks. Every year since 1984, the manufacturer has organized a large conference to which it invites the “Apple family.” The California company understood early the marketing value of creating a community of users, as opposed to mere consumers. The most famous of these gatherings is the Macworld Conference Expo that takes place in January every year. Apple presents its earnings, its plans, and its new software and hardware. The “Mac family” waits like impatient fans for Steve Notes, the boss’s presentation.16 Over the years, these annual meetings have become highly codified ceremonies, the staging of which Steve Jobs has brought to a high polish.
Why Information Grows: The Evolution of Order, From Atoms to Economies by Cesar Hidalgo
Robert Solow, Ada Lovelace, Albert Einstein, Arthur Eddington, assortative mating, business cycle, Claude Shannon: information theory, David Ricardo: comparative advantage, Douglas Hofstadter, Everything should be made as simple as possible, frictionless, frictionless market, George Akerlof, Gödel, Escher, Bach, income inequality, income per capita, industrial cluster, information asymmetry, invention of the telegraph, invisible hand, Isaac Newton, James Watt: steam engine, Jane Jacobs, job satisfaction, John von Neumann, Joi Ito, New Economic Geography, Norbert Wiener, p-value, Paul Samuelson, phenotype, price mechanism, Richard Florida, Ronald Coase, Rubik’s Cube, Silicon Valley, Simon Kuznets, Skype, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, The Market for Lemons, The Nature of the Firm, The Wealth of Nations by Adam Smith, total factor productivity, transaction costs, working-age population
Silicon Valley’s knowledge and knowhow are not contained in a collection of perennially unemployed experts but rather in the experts working in firms that participate in the design and development of software and hardware. In fact, the histories of most firms in Silicon Valley are highly interwoven. Steve Jobs worked at Atari and Steve Wozniak worked at HP before starting Apple. As mentioned previously, Steve Jobs is also famously known for “borrowing” the ideas of a graphical user interface and object-oriented programming from Xerox PARC. If HP, Atari, and Xerox PARC had not been located in the valley, it is likely that the knowledge and knowhow needed to get Apple started would not have been there, either. Hence, industries that require subsets of the knowledge and knowhow needed in other industries represent essential stepping-stones in the process of industrial diversification. The personbyte theory can also help us explain why large chunks of knowledge and knowhow are hard to accumulate and transfer, and why knowledge and knowhow are organized in the hierarchical pattern that is expressed in the nestedness of the industry-location data.
Consider two types of apples: those that grow on trees and you buy at the supermarket, and those that are designed in Silicon Valley. Both are traded in the economy, and both embody information, whether in biological cells or silicon chips. The main difference between them is not their number of parts or their ability to perform functions—edible apples are the result of tens of thousands of genes that perform sophisticated biochemical functions. The main difference between apples and Apples is that the apples we eat existed first in the world and then in our heads, while the Apples we use to check our email existed first in someone’s head and then in the world. Both of these apples are products and embody information, but only one of them—the silicon Apple—is a crystal of imagination.1 Thinking about products as crystals of imagination tells us that products do not just embody information but also imagination.
Both of these apples are products and embody information, but only one of them—the silicon Apple—is a crystal of imagination.1 Thinking about products as crystals of imagination tells us that products do not just embody information but also imagination. This is information that we have generated through mental computations and then disembodied by creating an object that mimics the one we had in our head. Edible apples existed before we had a name for them, a price for them, or a market for them. They were present in the world. As a concept, apples were simply imported into our minds. On the other hand, iPhones and iPads are mental exports rather than imports, since they are products that were begotten in our minds before they became part of our world. So the main difference between apples and Apples resides in the source of their physical order rather than in their embodiment of physical order. Both products are packets of information, but only one of them is a crystal of imagination. In this chapter I will emphasize the imaginary origin of the information embodied in products, as this is a fundamental characteristic of the type of information that humans grow and accumulate.
The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine by Peter Lunenfeld
Albert Einstein, Andrew Keen, anti-globalists, Apple II, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, Burning Man, business cycle, butterfly effect, computer age, creative destruction, crowdsourcing, cuban missile crisis, Dissolution of the Soviet Union, don't be evil, Douglas Engelbart, Dynabook, East Village, Edward Lorenz: Chaos theory, Fall of the Berlin Wall, Francis Fukuyama: the end of history, Frank Gehry, Grace Hopper, gravity well, Guggenheim Bilbao, Honoré de Balzac, Howard Rheingold, invention of movable type, Isaac Newton, Jacquard loom, Jane Jacobs, Jeff Bezos, John Markoff, John von Neumann, Kickstarter, Mark Zuckerberg, Marshall McLuhan, Mercator projection, Metcalfe’s law, Mother of all demos, mutually assured destruction, Nelson Mandela, Network effects, new economy, Norbert Wiener, PageRank, pattern recognition, peer-to-peer, planetary scale, plutocrats, Plutocrats, post-materialism, Potemkin village, RFID, Richard Feynman, Richard Stallman, Robert Metcalfe, Robert X Cringely, Schrödinger's Cat, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, Skype, social software, spaced repetition, Steve Ballmer, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, The Death and Life of Great American Cities, the medium is the message, Thomas L Friedman, Turing machine, Turing test, urban planning, urban renewal, Vannevar Bush, walkable city, Watson beat the top human players on Jeopardy!, William Shockley: the traitorous eight
Jobs and Gates started out when personal computing, that idea advanced by Licklider the Patriarch and Kay the Aquarian, was the province of a tiny group of obsessed hobbyists. It was a business, but one with a smaller market than fly-fishing. As teenagers in the 1970s, Jobs and Gates were part of this small group of hobbyists who purchased kits to make simple, programmable computers to use (and play with) at home. Jobs, along with Steve Wozniak, was a member of the best-known group of these enthusiasts, the Homebrew Computer Club of Cupertino, California. Gates, who had been programming since he found himself able to get access to a DEC mainframe in high school, was already writing software professionally while he was a student at Harvard. Jobs and Gates, along with their collaborators and competitors in the mid-1970s, were positioned at a fulcrum point, when a diversion turned into a business.
What made them both rich and powerful was their ability to meld the attributes of the two generations that preceded them—fusing the hardheaded business logic of the Plutocrats with the visionary futurity of the Aquarians. Jobs and Gates have an interesting competitive history, leapfrogging each other in the quest to achieve “insane greatness,” in Jobs’s words, and global market preeminence, for Gates.21 Jobs and his partner, Wozniak, were the first to make the leap from hobbyists to industrialists with their Apple computers, launched in 1976. It was the Apple II that really broke loose, in 1977, attracting a huge user base, and establishing Jobs and Wozniak as the first publicly lauded millionaire whiz kids of Silicon Valley. As important as their early success with the Apple II was, however, their greatest impact came seven years later, when they took the inspiration of people like Engelbart and Kay, and created a mass-market personal computer that set a new standard for participation. Before we get to that, we need to return to 1976, and move from Silicon Valley to New Mexico, where Gates and his partners, including former Harvard friends Paul Allen and Steve Ballmer, were writing programs for the Altair computer.
Steve Jobs, the cofounder and CEO of Apple, on the other hand, has been known to create a “reality distortion effect” around himself because of the intensity of his vision for computing. He worked for early electronic games pioneer Atari in the late 1970s and visited Xerox PARC, where he saw the work infused with Engelbart and Kay’s Aquarian vision. This spirit resonated with Jobs, who at one point had taken a personal pilgrimage to India and lived in an ashram. But even more so, the meme of participation entered his head on those visits to PARC. The Apple II, released in 1977, was unique in having a graphics capability and a soundboard built in. Here was the first major computer for the masses, designed from the start as a multimedia machine. These Apple IIs became the de facto machines in classrooms around the country, and without a doubt prepared a generation of computer users for what was to come.
Making Ideas Happen: Overcoming the Obstacles Between Vision and Reality by Scott Belsky
centralized clearinghouse, index card, lone genius, market bubble, Merlin Mann, New Journalism, Results Only Work Environment, rolodex, side project, Silicon Valley, Steve Jobs, Steve Wozniak, supply-chain management, Tim Cook: Apple, Tony Hsieh, young professional
Approximately nine hundred people have traveled here from around the world for the annual TED conference. Leaders in the worlds of technology, entertainment, and design have come for a curated set of eighteen-minute presentations on new ideas and breakthroughs across industries. They have also made the trek to meet each other during the breaks and dinners that happen over the course of the five-day conference. The audience is star-studded. From tech legends like Bill Gates, Steve Wozniak, and Google founders Larry Page and Sergey Brin, to entertainment icons like Robin Williams and Ben Affleck, everyone has come to indulge themselves with a healthy dose of wonderment. TED’s tagline is “ideas worth spreading.” As chief curator Chris Anderson (not to be confused with Wired ’s Chris Anderson) explains, the purpose is “to put great people on the TED stage and let the rest happen as it will.”
You don’t need to set aside three actual rooms, but you do need a period of scrutiny in your creative process. You also don’t want to create too much structure around when you can and cannot generate new ideas. However, you must be willing to kill ideas liberally—for the sake of fully pursuing others. In a rare interview in BusinessWeek on Apple’s system for innovation, CEO Steve Jobs explained that, in fact, there is no system at Apple—and that spontaneity is a crucial element for innovation, so long as it is paired with the ability to say no without hesitation: Apple is a very disciplined company, and we have great processes. But that’s not what it’s about. Process makes you more efficient. But innovation comes from people meeting up in the hallways or calling each other at 10:30 at night with a new idea, or because they realized something that shoots holes in how we’ve been thinking about a problem.
At the same time, many of us don’t really associate such tasks with creativity and ideas. Since 2004, AMR Research, a leading authority on supply chain research that serves numerous Fortune 500 companies, has published an annual list of the twenty-five companies with the best supply chain management. You might be surprised to learn that Apple debuted on the list at No. 2 in 2007, and overtook companies such as Anheuser-Busch, Wal-Mart, Procter & Gamble, and Toyota to take the No. 1 slot in 2008. Why would Apple, a company known for new ideas and its ability to “think different,” also be one of the most organized companies on the planet? The answer is that—like it or not—organization is a major force for making ideas happen. Organization is just as important as ideas when it comes to making an impact. Consider the following equation: CREATIVITY X ORGANIZATION = IMPACT If the impact of our ideas is, in fact, largely determined by our ability to stay organized, then we would observe that those with tons of creativity but little to no organization yield, on average, nothing.
Geek Sublime: The Beauty of Code, the Code of Beauty by Vikram Chandra
Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, British Empire, business process, conceptual framework, create, read, update, delete, crowdsourcing, don't repeat yourself, Donald Knuth, East Village, European colonialism, finite state, Firefox, Flash crash, glass ceiling, Grace Hopper, haute couture, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Norman Mailer, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, supercomputer in your pocket, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce
I didn’t feel comfortable hacking up the code of a Real Programmer.12 Despite the allusion above to “the *macho* side of programming,” the non-geek may not fully grasp that within the culture of programmers, Mel es muy macho. The Real Programmer squints his eyes, does his work, and rides into the horizon to the whistling notes of Ennio Morricone. To you, Steve Wozniak may be that cuddly penguin who was on a few episodes of Dancing with the Stars, and by all accounts, he really is the good, generous man one sees in interviews. But within the imaginations of programmers, Woz is also a hard man, an Original Gangsta: he wired together his television set and a keyboard and a bunch of chips on a circuit board and so created the Apple I computer. Then he realized he needed a programming language for the microprocessor he’d used, and none existed, so Woz—who had never taken a language-design class—read a couple of books, wrote a compiler, and then wrote a programming language called Integer BASIC in machine code.
And when we say “wrote” this programming language we mean that he wrote the assembly code in a paper notebook on the right side of the pages, and then transcribed it into machine code on the left.13 And he did all this while holding down a full-time job at Hewlett-Packard: “I designed two computers and cassette tape interfaces and printer interfaces and serial ports and I wrote a Basic and all this application software, I wrote demos, and I did all this moonlighting, all in a year.”14 That second computer was the Apple II, the machine that defined personal computing, that is on every list of the greatest computers ever made. Woz designed all the hardware and all the circuit boards and all the software that went into the Apple II, while the other Steve spewed marketing talk at potential investors and customers on the phone. Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found, “not one bug in the hardware, not one bug in the software.”15 The circuit design of the Apple II is widely considered to be astonishingly beautiful, as close to perfection as one can get in engineering. Woz did both hardware and software. Woz created a programming language in machine code.
In Beautiful Code: Leading Programmers Explain How They Think, edited by Andy Oram and Greg Wilson, loc. 13832–6127. Sebastopol, CA: O’Reilly Media, 2007. Kindle edition. Matthews, Peter Hugoe. The Concise Oxford Dictionary of Linguistics. 2nd ed. Oxford: Oxford University Press, 2007. Matyszczyk, Chris. “Woz: Microsoft Might Be More Creative Than Apple.” Technically Incorrect—CNET News, November 15, 2012. http://news.cnet.com/8301-17852_3-57550839-71/woz-microsoft-might-be-more-creative-than-apple/. McCrea, Lawrence J. The Teleology of Poetics in Medieval Kashmir. Cambridge, MA: Harvard University, Department of Sanskrit and Indian Studies, 2008. McPherson, Amanda, Brian Proffitt, and Ron Hale-Evans. “Estimating the Total Development Cost of a Linux Distribution.” Linuxfoundation.org, September 2008. http://www.linuxfoundation.org/sites/main/files/publications/estimatinglinux.html.
The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age by Paul J. Nahin
Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Claude Shannon: information theory, conceptual framework, Edward Thorp, Fellow of the Royal Society, finite state, four colour theorem, Georg Cantor, Grace Hopper, Isaac Newton, John von Neumann, knapsack problem, New Journalism, Pierre-Simon Laplace, reversible computing, Richard Feynman, Schrödinger's Cat, Steve Jobs, Steve Wozniak, thinkpad, Thomas Bayes, Turing machine, Turing test, V2 rocket
A diagram like Figure 7.1.2 is very nice for a high-level, slide-show management meeting (I call it a Jobs-diagram, in honor of Apple’s late marketing genius Steve Jobs, who sold a good line, but who I suspect might have been more than just a little vague on what is actually inside an Apple computer or iPad). For engineers who are tasked with building real hardware, however, it really won’t do. What we need to do now is show precisely how to build both the parity bit generator logic at the source end of the channel, and the parity bit checking logic at the receiver end of the channel. What we are aiming for is a Wozniak-diagram (in honor of Apple’s Steve Wozniak, the technical brains behind the original Apple computer). 7.2 THE EXCLUSIVE-OR GATE (XOR) To lay the groundwork for parity logic, this section introduces a “new” logic gate, the exclusive-OR (written as XOR).
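The two pieces Nahin describes—a parity bit generator at the source and a parity checker at the receiver—can be sketched in a few lines of Python before descending to gate level. This is my own illustrative sketch (even parity, built from chained XORs), not the book's circuit:

```python
from functools import reduce
from operator import xor

def parity_bit(data_bits):
    """Source end: generate an even-parity bit by XOR-ing all data bits."""
    return reduce(xor, data_bits, 0)

def parity_ok(received_bits):
    """Receiver end: with even parity, data bits plus the parity bit
    XOR to 0 unless an odd number of bits flipped in transit."""
    return reduce(xor, received_bits, 0) == 0

word = [1, 0, 1, 1]
sent = word + [parity_bit(word)]   # append the parity bit to the word
```

Flipping any single bit of `sent` makes `parity_ok` return `False`, which is exactly the single-error detection the XOR gate network implements in hardware.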
And yet, as we proceed through the book, I’ll show you how they too will easily yield to routine Boolean algebraic analysis.3 PUZZLE 2 The local truant officer has six boys under suspicion for stealing apples. He knows that only two are actually guilty (but not which two), and so he questions each boy individually. (a) Harry said, “Charlie and George did it.” (b) James said, “Donald and Tom did it.” (c) Donald said, “Tom and Charlie did it.” (d) George said, “Harry and Charlie did it.” (e) Charlie said, “Donald and James did it.” (f) Tom couldn’t be found and didn’t say anything. (g) Of the five boys interrogated, four of them each correctly named one of the guilty. (h) The remaining boy lied about both of the names he gave. Who stole the apples? PUZZLE 3 Alice, Brenda, Cissie, and Doreen competed for a scholarship. “What luck have you had?” someone asked them.
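Puzzle 2 really does yield to mechanical analysis, as the author promises. A brute-force sketch in Python (my encoding of clues (a)-(h), not the book's Boolean-algebra solution): try every pair of culprits and keep the pair for which four accusers each named exactly one thief and the fifth named neither.

```python
from itertools import combinations

boys = ["Harry", "James", "Donald", "George", "Charlie", "Tom"]

# Each interrogated boy's accusation, straight from clues (a)-(e).
accusations = {
    "Harry":   {"Charlie", "George"},
    "James":   {"Donald", "Tom"},
    "Donald":  {"Tom", "Charlie"},
    "George":  {"Harry", "Charlie"},
    "Charlie": {"Donald", "James"},
}

def solve():
    # Clues (g) and (h): of the five accusers, four each correctly
    # named one thief; the remaining boy lied about both names.
    for pair in combinations(boys, 2):
        guilty = set(pair)
        hits = sorted(len(named & guilty) for named in accusations.values())
        if hits == [0, 1, 1, 1, 1]:
            yield guilty

solutions = list(solve())
```

Running the search confirms the constraints pin down a unique pair of apple thieves.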
Demystifying Smart Cities by Anders Lisdorf
3D printing, artificial general intelligence, autonomous vehicles, bitcoin, business intelligence, business process, chief data officer, clean water, cloud computing, computer vision, continuous integration, crowdsourcing, data is the new oil, digital twin, distributed ledger, don't be evil, Elon Musk, en.wikipedia.org, facts on the ground, Google Glasses, income inequality, Infrastructure as a Service, Internet of things, Masdar, microservices, Minecraft, platform as a service, ransomware, RFID, ride hailing / ride sharing, risk tolerance, self-driving car, smart cities, smart meter, software as a service, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Stuxnet, Thomas Bayes, Turing test, urban sprawl, zero-sum game
In order to understand this type, you have to think about a typical civil engineer who is given the task to design a bridge. He does not need to know why the bridge is wanted or what its potential is, nor does he need to plan or carry out the actual work. He needs to find a way to do something very specific and nontrivial based on his knowledge of technology and environment. A prime example is Steve Wozniak who designed the Apple I and II according to the vision of Steve Jobs. Not a lot of people of this type are well known in public because they usually do their job without any great publicity or acclaim. Engineer types are often lead developers or solution architects with responsibility for a technical product or solution. They are frequently found as presales engineers of vendors. The engineer needs to be engaged later in the change process when visions and masterplans have been developed.
Creating a special innovation team is a bad way since this isolates the innovative behavior from influencing the way the organization as a whole works. We should therefore look at subtle ways of changing the existing routines. This is also why most diets are unsuccessful: the amount of change is too massive and will feel alien. We are therefore looking at minimal changes similar to substituting our delicious donut with an apple. Examples could be to let employees spend time on innovative projects that would usually be considered a waste of time or circulate them through an innovation lab. Reward – For the habit-forming process, this is the most important point. It is not enough to have a ceremony after an innovation challenge and issue press releases. The reward needs to come consistently and as a product of the routine described previously.
In this way, politicians can start by looking at available policy tools to start a dynamic to radically more self-sufficient and sustainable cities. Self-sustaining cities are also vital for the next frontier of human civilization: space. Within a hundred years, the first budding cities will begin to appear in our solar system – presumably the moon and Mars first, but other targets such as the moons of Jupiter and Saturn are other good candidates for future cities. These cities will not have nearby farms where you can go to buy apples or distribution centers from where you can get a new computer delivered next day through a carrier service. Virtually everything needs to be produced and recycled within the context of the city. In the future cities in space, we have to be able to produce our own food. Not only do we need hydroponic farms but also advanced gene editing solutions. We can’t bring the seeds of all the crop we want, and we can’t foresee what traits are needed to be strengthened in the new surroundings.
Ctrl Alt Delete: Reboot Your Business. Reboot Your Life. Your Future Depends on It. by Mitch Joel
3D printing, Amazon Web Services, augmented reality, call centre, clockwatching, cloud computing, Firefox, future of work, ghettoisation, Google Chrome, Google Glasses, Google Hangouts, Khan Academy, Kickstarter, Kodak vs Instagram, Lean Startup, Marc Andreessen, Mark Zuckerberg, Network effects, new economy, Occupy movement, place-making, prediction markets, pre–internet, QR code, recommendation engine, Richard Florida, risk tolerance, self-driving car, Silicon Valley, Silicon Valley startup, Skype, social graph, social web, Steve Jobs, Steve Wozniak, Thomas L Friedman, Tim Cook: Apple, Tony Hsieh, white picket fence, WikiLeaks, zero-sum game
What can only be described as a contemporary subculture, this annual event showcases “makers”—people who create robotics, electronics, woodworking, 3D printing, and more. These hobbyists embody the next generation of the same philosophical ideologies that brought together people interested in computers and computing back in the 1970s at computer clubs and meetups (the places that people like Bill Gates and Steve Wozniak used to hang out). What’s now being worked on in these garages and shared at events like Maker Faire is a combination of invention and prototyping. We’re evolving from computer hardware and software into more tangible things (concept cars, robots, and more). The Maker Movement is closely tied to the rise of hackers, and people like Tim O’Reilly (founder of O’Reilly Media and advocate of the free software and open-source movements) have described these events as the most exciting ways to see what the future holds for humanity.
The company never pulled the trigger on their e-commerce project, and now they’re busy scrambling for “likes” on Facebook and are selling their products through the handful of big-box retailers left. Ironically, other, scrappier startups have disrupted this traditional retail model with digital-only brands that are capturing the imagination (and money) of consumers all over the world. WHAT APPLE KNOWS. What happened prior to 2001 that made Apple go into the retail business? Whenever the topic of Apple and the Apple retail experience (aka Apple Store) is brought up, many media pundits roll their eyes as if the success of these sparse and crisp stores is some kind of anomaly in business lore. It’s not. Apple came to a conclusion in the 1990s that many businesses have yet to wake up to. They knew that if potential customers walked into a traditional consumer electronics goods store and became inundated with a massive selection of computers and laptops, they would, instinctively, defer to the first sales associate they could wrestle down.
No one knows the value of simplicity and the power that it brings better than Ron Johnson. Prior to becoming the CEO of JCPenney, Johnson was the senior vice president of retail operations at Apple. In short, he led the concept of both the Apple retail stores and the Genius Bar. His record at Apple is pristine. Within two years of the first store opening, the retail operation of Apple surpassed a billion dollars in annual sales (beating the record held by The Gap). Globally, Apple now has over three hundred stores, and their expansion plans continue to be as aggressive as their product launches. In November 2011, Johnson left Apple to lead JCPenney through this time of purgatory and reboot. His first big and bold moves made news as the 110-year-old company not only struggles to remain relevant but fights within the constraints of the traditional retail world—a place where being an anchor store at a highly coveted shopping mall was the difference between success and failure.
Makers by Chris Anderson
3D printing, Airbnb, Any sufficiently advanced technology is indistinguishable from magic, Apple II, autonomous vehicles, barriers to entry, Buckminster Fuller, Build a better mousetrap, business process, commoditize, Computer Numeric Control, crowdsourcing, dark matter, David Ricardo: comparative advantage, death of newspapers, dematerialisation, Elon Musk, factory automation, Firefox, future of work, global supply chain, global village, IKEA effect, industrial robot, interchangeable parts, Internet of things, inventory management, James Hargreaves, James Watt: steam engine, Jeff Bezos, job automation, Joseph Schumpeter, Kickstarter, Lean Startup, manufacturing employment, Mark Zuckerberg, means of production, Menlo Park, Network effects, private space industry, profit maximization, QR code, race to the bottom, Richard Feynman, Ronald Coase, Rubik’s Cube, self-driving car, side project, Silicon Valley, Silicon Valley startup, Skype, slashdot, South of Market, San Francisco, spinning jenny, Startup school, stem cell, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supply-chain management, The Nature of the Firm, The Wealth of Nations by Adam Smith, transaction costs, trickle-down economics, Whole Earth Catalog, X Prize, Y Combinator
Writing in Wired,12 Steven Levy explained the connection, which led to the original Apple II in 1977: His dad, Paul—a machinist who had never completed high school—had set aside a section of his workbench for Steve, and taught him how to build things, disassemble them, and put them together. From neighbors who worked in the electronics firm in the Valley, he learned about that field—and also understood that things like television sets were not magical things that just showed up in one’s house, but designed objects that human beings had painstakingly created. “It gave a tremendous sense of self-confidence, that through exploration and learning one could understand seemingly very complex things in one’s environment,” he told [an] interviewer. Later, when Jobs and his Apple cofounder, Steve Wozniak, were members of the Homebrew Computer Club, they saw the potential of desktop tools—in this case the personal computer—to change not just people’s lives, but also the world.
Yet when the truly personal—“desktop”—computer did eventually arrive with the Apple II and then the IBM PC, countless uses quickly emerged, starting with the spreadsheet and word processor for business and quickly moving to entertainment with video games and communications. This was not because the wise minds of the big computer companies had finally figured out why people would want one, but because people found new uses all by themselves. Then, in 1985, Apple released the LaserWriter, the first real desktop laser printer, which, along with the Mac, started the desktop publishing phenomenon. It was a jaw-dropping moment, combining in the public imagination words that had never gone together before: “desktop” and “publishing”! Famously, Apple’s printer had more processing power than the Mac itself, which was necessary to interpret the PostScript page description language that was originally designed for commercial printers costing ten times as much.
Take even the best company you can think of, say Apple, and consider how it hires. First, it’s based in the United States, and most of its employees are in Cupertino, California. So there’s a bias toward those who are already in the United States, or can legally work in the country, as well as toward those who live in the San Francisco Bay Area or are willing to move there. (It’s lovely in Cupertino, but if your spouse doesn’t want to leave her family in Rome or Chang Mai, that may matter more.) Like all companies, Apple favors people with experience in the industry it’s hiring for, and it likes to see degrees from good universities as an indication of intelligence and work ethic. Even though Steve Jobs was a genius teenage dropout, there aren’t many others like him at Apple. The company may “think different,” but these days it hires pretty much like every other good company: based on professional qualifications.
Humans Are Underrated: What High Achievers Know That Brilliant Machines Never Will by Geoff Colvin
Ada Lovelace, autonomous vehicles, Baxter: Rethink Robotics, Black Swan, call centre, capital asset pricing model, commoditize, computer age, corporate governance, creative destruction, deskilling, en.wikipedia.org, Freestyle chess, future of work, Google Glasses, Grace Hopper, industrial cluster, industrial robot, interchangeable parts, job automation, knowledge worker, low skilled workers, Marc Andreessen, meta analysis, meta-analysis, Narrative Science, new economy, rising living standards, self-driving car, sentiment analysis, Silicon Valley, Skype, social intelligence, Steve Jobs, Steve Wozniak, Steven Levy, Steven Pinker, theory of mind, Tim Cook: Apple, transaction costs
The finding that groups are more creative when their members trust one another helps explain a phenomenon frequently observed: that the most creative groups of all are often groups of two. The writer Joshua Wolf Shenk has pointed out the truly astounding number of pairs who have produced many of the world’s greatest creative successes. Think of John Lennon and Paul McCartney, Steve Jobs and Steve Wozniak, James Watson and Francis Crick, Jean-Paul Sartre and Simone de Beauvoir—once you get started you can think of them all day without even mentioning less known two-member teams like C. S. Lewis and J. R. R. Tolkien. Shenk argues that they all developed mutual trust so deep that it became faith in one another. “What I saw . . . in creative pairs was trust developing in concert as pairs took risks together,” he has observed, “like when Neal Brennan and Dave Chappelle pitched HBO an idea for a comedy show—and got shot down—or when Warren Buffett and Charlie Munger bought See’s Candy and turned a solid profit.”
Exhibit A was Apple’s top team under Steve Jobs. The conventional view of Apple’s success is that it derived from Jobs’s genius and dictatorial management, but Jobs knew that wasn’t nearly enough. He worked extraordinarily hard to assemble and keep a highly effective top team, which is an extremely difficult feat in a successful company. As the company prospers, other firms try to lure away its executives, usually with higher-level, higher-paying, more highly visible roles, and the temptation can be overwhelming. Nonetheless, by the time Jobs stepped down as CEO in August 2011, the six-executive inner circle he had assembled had been working as a team for thirteen years, meeting together for hours every week. This is virtually unheard of and appears to be unique among companies of Apple’s size and success.
Creativity comes from spontaneous meetings, from random discussions. You run into someone, you ask what they’re doing, you say ‘Wow,’ and soon you’re cooking up all sorts of ideas.’” It all has to happen in person. That’s why Jobs famously designed the Pixar headquarters the way he did. Pixar is the animation studio that Jobs initially funded and eventually ran in the years before he returned to Apple and for several years thereafter. It’s arguably the most successful film studio ever, since it has never produced a flop. The Toy Story films, Finding Nemo, the Cars films—of the fourteen features it had produced through 2013, every one was a major financial winner. Jobs wanted to keep it that way, so he insisted that Pixar’s new headquarters be designed around a central atrium; he then placed the café, mailboxes, conference rooms, and other elements so as to force people to criss-cross it.
Working in Public: The Making and Maintenance of Open Source Software by Nadia Eghbal
Amazon Web Services, barriers to entry, Benevolent Dictator For Life (BDFL), bitcoin, Clayton Christensen, cloud computing, commoditize, continuous integration, crowdsourcing, cryptocurrency, David Heinemeier Hansson, death of newspapers, Debian, disruptive innovation, en.wikipedia.org, Ethereum, Firefox, Guido van Rossum, Hacker Ethic, Induced demand, informal economy, Jane Jacobs, Jean Tirole, Kevin Kelly, Kickstarter, Kubernetes, Mark Zuckerberg, Menlo Park, Network effects, node package manager, Norbert Wiener, pirate software, pull request, RFC: Request For Comment, Richard Stallman, Ronald Coase, Ruby on Rails, side project, Silicon Valley, Snapchat, social graph, software as a service, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, The Death and Life of Great American Cities, The Nature of the Firm, transaction costs, two-sided market, urban planning, web application, wikimedia commons, Zimmermann PGP
Free software was counterculture, and it fell right in line with the burgeoning hacker culture of the times. The term “hacker” was popularized by author Steven Levy, who memorably captured a portrait of the 1980s hacker generation in the book Hackers: Heroes of the Computer Revolution. In Hackers, Levy profiles a number of well-known programmers of the time, including Bill Gates, Steve Jobs, Steve Wozniak, and Richard Stallman. He suggests that hackers believe in sharing, openness, and decentralization, which he calls the “hacker ethic.”17 According to Levy’s portrait, hackers care about improving the world, but don’t believe in following the rules to get there. Hackers are characterized by bravado, showmanship, mischievousness, and a deep mistrust of authority. Hacker culture still lives on today, in the way that beatniks, hippies, and Marxists still exist, but hackers don’t capture the software cultural zeitgeist in the same way that they used to.
When maintenance is involved, however, software’s marginal and temporal costs begin to add up. MARGINAL COSTS We believe that software is zero marginal cost due to the following properties, which together imply that additional copies are cheap to produce: NON-RIVALRY: If I download code from GitHub, my decision doesn’t diminish your ability to download that same code. (By contrast, if I bite into an apple and hand it to you, there is now less apple for you to eat.) NON-EXCLUDABILITY: If someone owns a copy of my code, it is difficult for me to prevent them from sharing it with others. (By contrast, if I build a theme park, I can prevent people from entering by putting up a turnstile and charging admission.) Technology policy writer David Bollier paints a rosy picture of what he calls the “information commons,” or online information goods, including open source software, under the commonly held belief that these are non-rival goods.
Just as there are Instagram influencers and Twitch streamers, there are GitHub developers. These activities take place away from platforms, too—you can still upload photos or videos of your Hawaii vacation to a self-hosted website—but why would you? For those hoping to reach an audience, platforms and creators have become inseparable. Platforms are often portrayed as being at odds with creators. App: The Human Story is a documentary about Apple App Store developers who struggle against the limitations of their platform.34 Facebook, meanwhile, is frequently accused of “eras[ing] a huge part of publishers’ audience” with the “stroke of an algorithm.”35 But for all the problems that platforms might have caused, they’ve also delivered immeasurable value. Today’s open source developers seem to genuinely love GitHub as a place to write, share, and discover code.
Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers by John MacCormick, Chris Bishop
Ada Lovelace, AltaVista, Claude Shannon: information theory, fault tolerance, information retrieval, Menlo Park, PageRank, pattern recognition, Richard Feynman, Silicon Valley, Simon Singh, sorting algorithm, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, traveling salesman, Turing machine, Turing test, Vannevar Bush
Over 50 years earlier—in 1939, with the world economy still reeling from the Great Depression—Hewlett-Packard got underway in Dave Packard's garage in Palo Alto, California. Several decades after that, in 1976, Steve Jobs and Steve Wozniak operated out of Jobs' garage in Los Altos, California, after founding their now-legendary Apple computer company. (Although popular lore has it that Apple was founded in the garage, Jobs and Wozniak actually worked out of a bedroom at first. They soon ran out of space and moved into the garage.) But perhaps even more remarkable than the HP and Apple success stories is the launch of a search engine called Google, which operated out of a garage in Menlo Park, California, when first incorporated as a company in September 1998. By that time, Google had in fact already been running its web search service for well over a year—initially from servers at Stanford University, where both of the cofounders were Ph.D. students.
(If the “.docx” here seems mysterious to you, check out the box on the facing page to find out about file name extensions.) Let's be very clear about one thing: in both cases, I'm running exactly the same computer program, which is Microsoft Word. It's just that the inputs are different in each case. Don't be fooled by the fact that all modern operating systems let you run a computer program by double-clicking on a document. That is just a convenience that your friendly computer company (most likely Apple or Microsoft) has provided you. When you double-click on a document, a certain computer program gets run, and that program uses the document as its input. The output of the program is what you see on the screen, and naturally it depends on what document you clicked on. Throughout this chapter, I'll be using file names like “abcd.txt.” The part after the period is called the “extension” of the file name—in this case, the extension of “abcd.txt” is “txt.”
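The extension convention described here can be sketched in a few lines of Python (a minimal illustration only; the `OPENS_WITH` mapping is a made-up toy table, not how any real operating system stores its file associations):

```python
from pathlib import Path

def extension(filename: str) -> str:
    """Return the part of a file name after the last period."""
    return Path(filename).suffix.lstrip(".")

# Toy table standing in for the OS's file-association registry:
# the extension, not the file's contents, picks the program to run.
OPENS_WITH = {"txt": "text editor", "docx": "Microsoft Word"}

print(extension("abcd.txt"))                 # prints "txt"
print(OPENS_WITH[extension("abcd.docx")])    # prints "Microsoft Word"
```

Either way, the program that gets launched simply receives the clicked document as its input, which is the point of the passage above.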
See artificial intelligence algorithm: books on; criteria for greatness; definition of; future of; lack of; relationship to programming; significance of. See also addition algorithm; checksum; compression; digital signature; error-correcting code; Dijkstra's shortest-path algorithm; Euclid's algorithm; factorization; JPEG; key exchange; LZ77; matching; nine algorithms; PageRank; public key; ranking; RSA; web search AltaVista AlwaysYes.exe Amazon Analytical Engine AntiCrashOnSelf.exe AntiYesOnSelf.exe Apple artifact. See compression artificial intelligence. See also pattern recognition artificial neural network. See neural network As We May Think astronomy Atlantic magazine atomic. See transaction audio. See also compression Austen, Jane authentication authority: score; of a web page. See also certification authority authority trick B-tree Babylonia backup bank; account number; balance; for keys; online banking; for signatures; transfer; as trusted third party base, in exponentiation Battelle, John Bell Telephone Company binary Bing biology biometric sensor Bishop, Christopher bit block cipher body, of a web page brain Brin, Sergey British government browser brute force bug Burrows, Mike Bush, Vannevar Businessweek Byzantine fault tolerance C++ programming language CA.
A Curious Mind: The Secret to a Bigger Life by Brian Grazer, Charles Fishman
4chan, Airbnb, Albert Einstein, Apple II, Asperger Syndrome, Bonfire of the Vanities, en.wikipedia.org, game design, Google Chrome, Howard Zinn, Isaac Newton, Jeff Bezos, Kickstarter, Norman Mailer, orbital mechanics / astrodynamics, out of africa, RAND corporation, Ronald Reagan, Silicon Valley, stem cell, Steve Jobs, Steve Wozniak, the scientific method, Tim Cook: Apple
Williams: former police chief of Los Angeles Marianne Williamson: spiritual teacher, New Age guru Ian Wilmut: embryologist, led the team of researchers who first successfully cloned a mammal (a sheep named Dolly) E. O. Wilson: biologist, author, professor emeritus at Harvard University, two-time winner of the Pulitzer Prize Oprah Winfrey: founder and chairwoman of the Oprah Winfrey Network, actress, author George C. Wolfe: playwright, theater director, two-time winner of the Tony Award Steve Wozniak: cofounder of Apple Inc., designer of Apple I and Apple II computers, inventor John D. Wren: president and CEO of marketing and communications company Omnicom Will Wright: game designer, creator of Sim City and The Sims Steve Wynn: businessman, Las Vegas casino magnate Gideon Yago: writer, former correspondent for MTV News Eitan Yardeni: teacher and spiritual counselor at the Kabbalah Centre Daniel Yergin: economist, author of The Prize: The Epic Quest for Oil, Money and Power, winner of the Pulitzer Prize Dan York: chief content officer at DirecTV, former president of content and advertising sales, AT&T Michael W.
Simpson Jared Cohen: director of Google Ideas Joel Cohen: population specialist, mathematical biologist Kat Cohen: university admissions counselor, author of The Truth About Getting In William Colby: CIA director, 1973–1976 Elizabeth Baron Cole: nutritionist Jim Collins: management consultant, expert on business and management, author of Good to Great Robert Collins: neurologist, former chairman of neurology at UCLA School of Medicine Sean Combs: musician, music producer, fashion designer, entrepreneur Richard Conniff: author who specializes in human and animal behavior Tim Cook: CEO of Apple, Inc. Tatiana Cooley-Marquardt: repeat winner of USA Memory Championship Anderson Cooper: journalist, author, TV personality, anchor of CNN’s Anderson Cooper 360 Norman Cousins: medical guru, author of Anatomy of an Illness: As Perceived by the Patient Jacques Cousteau: oceanographer, pioneered marine conservation Chris W. Cox: chief lobbyist for the National Rifle Association Steve Coz: former editor of National Enquirer Donald Cram: professor of chemistry at UCLA, Nobel laureate in chemistry Jim Cramer: investor, author, TV personality, host of CNBC’s Mad Money Clyde Cronkhite: criminal justice expert, former police chief of Santa Ana, former deputy police chief of Los Angeles Mark Cuban: investor, owner of the NBA’s Dallas Mavericks Heidi Siegmund Cuda: journalist, former music critic for the Los Angeles Times Thomas Cummings: leading expert in designing high-performing organizations and strategic change at USC Marshall School of Business Fred Cuny: disaster relief specialist Mario Cuomo: governor of New York, 1983–1994 Alan Dershowitz: attorney, constitutional scholar, professor emeritus at Harvard Law School Donny Deutsch: advertising executive, TV personality Jared Diamond: evolutionary biologist, author, professor at UCLA, winner of the Pulitzer Prize Alfred “Fred” DiSipio: record promoter investigated during payola scandal DMX: musician, actor Thomas R.
News & World Report, winner of the Pulitzer Prize for investigative reporting Jack Healey: human rights activist, former executive director of Amnesty International USA Thomas Heaton: seismologist, professor at California Institute of Technology, contributed to the development of earthquake early warning systems Peter Herbst: journalist, former editor of Premiere and New York magazines Danette Herman: talent executive for Academy Awards Seymour Hersh: investigative reporter, author, winner of the Pulitzer Prize for uncovering the My Lai massacre and its cover-up during the Vietnam War Dave Hickey: art and cultural critic who has written for Harper’s, Rolling Stone, and Vanity Fair Jim Hightower: progressive political activist, radio talk-show host Tommy Hilfiger: fashion designer, founder of lifestyle brand Christopher Hitchens: journalist and author who was a critic of politics and religion David Hockney: artist and major contributor to the Pop art movement in the 1960s Nancy Irwin: hypnotherapist Chris Isaak: musician, actor Michael Jackson: singer, songwriter, his 1982 album Thriller is the bestselling album of all time LeBron James: NBA basketball player Mort Janklow: literary agent, founder and chairman of the literary agency Janklow & Nesbit Associates Jay Z: musician, music producer, fashion designer, entrepreneur Wyclef Jean: musician, actor James Jebbia: CEO of the Supreme clothing brand Harry J. Jerison: paleoneurologist, professor emeritus at UCLA Steve Jobs: cofounder and former CEO of Apple Inc., cofounder and former CEO of Pixar Betsey Johnson: fashion designer Jamie Johnson: documentary filmmaker who directed Born Rich, heir to Johnson & Johnson fortune Larry C. Johnson: former analyst for the CIA, security and terrorism consultant Robert L. 
Johnson: businessman, media magnate, cofounder and former chairman of BET Sheila Johnson: cofounder of BET, first African American woman to be an owner/partner in three professional sports teams Steve Johnson: media theorist, popular science author, cocreated online magazine FEED Jackie Joyner-Kersee: Olympic gold medalist, track star Paul Kagame: president of Rwanda Michiko Kakutani: book critic for the New York Times, winner of the Pulitzer Prize for criticism Sam Hall Kaplan: former architecture critic for the Los Angeles Times Masoud Karkehabadi: wunderkind who graduated from college at age thirteen Patrick Keefe: author, staff writer for the New Yorker Gershon Kekst: founder of the corporate communications company Kekst and Co.
The End of Big: How the Internet Makes David the New Goliath by Nicco Mele
4chan, A Declaration of the Independence of Cyberspace, Airbnb, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, Apple's 1984 Super Bowl advert, barriers to entry, Berlin Wall, big-box store, bitcoin, business climate, call centre, Cass Sunstein, centralized clearinghouse, Chelsea Manning, citizen journalism, cloud computing, collaborative consumption, collaborative editing, commoditize, creative destruction, crony capitalism, cross-subsidies, crowdsourcing, David Brooks, death of newspapers, disruptive innovation, Donald Trump, Douglas Engelbart, en.wikipedia.org, Exxon Valdez, Fall of the Berlin Wall, Filter Bubble, Firefox, global supply chain, Google Chrome, Gordon Gekko, Hacker Ethic, Jaron Lanier, Jeff Bezos, jimmy wales, John Markoff, Julian Assange, Kevin Kelly, Khan Academy, Kickstarter, Lean Startup, Mark Zuckerberg, minimum viable product, Mitch Kapor, Mohammed Bouazizi, Mother of all demos, Narrative Science, new economy, Occupy movement, old-boy network, peer-to-peer, period drama, Peter Thiel, pirate software, publication bias, Robert Metcalfe, Ronald Reagan, Ronald Reagan: Tear down this wall, sharing economy, Silicon Valley, Skype, social web, Steve Jobs, Steve Wozniak, Stewart Brand, Stuxnet, Ted Nelson, Telecommunications Act of 1996, telemarketer, The Wisdom of Crowds, transaction costs, uranium enrichment, Whole Earth Catalog, WikiLeaks, Zipcar
One group, the People’s Computer Company, put this explanation on the cover of its newsletter: “Computers are mostly used against people instead of for people; used to control people instead of to free them; Time to change all that—we need a … People’s Computer Company.”13 It all amounted to a sharp departure from mainstream computer science in America, which lived on in the giant mainframes of academic and government institutions. A famous example of the burgeoning anti-institutional computer counterculture is the Homebrew Computer Club, an ad hoc group of hobbyist nerds who in 1975 began meeting once a month in Gordon French’s garage in Silicon Valley. Some of its more famous members included the Apple founders Steve Jobs and Steve Wozniak.14 Gates drew the ire of the Homebrew Computer Club by selling something that had previously been given away free—a terrible development for hobbyists. Microsoft’s first software product, Altair BASIC, was sold at a time when software was generally bundled with a hardware purchase. Homebrew members famously started to circulate illegal copies of the software at the group’s meetings—arguably the first instance of pirating software.
A liberationist ethic also became entrenched in the overt marketing of personal computing devices, most famously in a classic television commercial, Apple’s 1984 spot. Following the rousing success of Apple’s first two home computer models, Steve Jobs wanted to do something big to roll out its third model, the Macintosh personal computer. He hired Ridley Scott, who two years earlier had directed the sci-fi classic Blade Runner, to make the commercial.18 The result was a powerful and intense ad that referenced the dystopian future of George Orwell’s classic novel 1984. In the ad, a young woman breaks into a large auditorium where a crowd of mindless automatons sit listening to a giant screen of a speaking man, presumably Big Brother. The woman, representing the Macintosh (she has a sketch of the Mac on her tank top), smashes the screen. The advertisement closes with the text, “On January 24th, Apple Computer will introduce Macintosh.
But it’s easy to imagine that ten years from now, every home will have a 3-D printer, just like every home today has a microwave. Over time, these 3-D printers will grow more advanced. Once nanotechnology hits its stride, 3-D printers will build complex machines such as an iPhone in your own home. In fact, Apple is already preparing for this future. Take the iPhone you’ve got in your pocket right now and turn it over. You’ll see that it says in fine print, “Designed by Apple in California. Assembled in China.” Apple understands that the design is the important part of what companies do, not manufacturing, and it is already staking its claim to the design. The Coming Plague of Shanzhai On-demand fabrication has the potential to do a lot of good around the world. A group of smarties at MIT has developed the Fab Lab—a kit of about $20,000 worth of equipment that allows all kinds of things to be manufactured on-demand.
The Internet Is Not the Answer by Andrew Keen
"Robert Solow", 3D printing, A Declaration of the Independence of Cyberspace, Airbnb, AltaVista, Andrew Keen, augmented reality, Bay Area Rapid Transit, Berlin Wall, bitcoin, Black Swan, Bob Geldof, Burning Man, Cass Sunstein, citizen journalism, Clayton Christensen, clean water, cloud computing, collective bargaining, Colonization of Mars, computer age, connected car, creative destruction, cuban missile crisis, David Brooks, disintermediation, disruptive innovation, Donald Davies, Downton Abbey, Edward Snowden, Elon Musk, Erik Brynjolfsson, Fall of the Berlin Wall, Filter Bubble, Francis Fukuyama: the end of history, Frank Gehry, Frederick Winslow Taylor, frictionless, full employment, future of work, gig economy, global village, Google bus, Google Glasses, Hacker Ethic, happiness index / gross national happiness, income inequality, index card, informal economy, information trail, Innovator's Dilemma, Internet of things, Isaac Newton, Jaron Lanier, Jeff Bezos, job automation, Joi Ito, Joseph Schumpeter, Julian Assange, Kevin Kelly, Kickstarter, Kodak vs Instagram, Lean Startup, libertarian paternalism, lifelogging, Lyft, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Martin Wolf, Metcalfe’s law, move fast and break things, move fast and break things, Nate Silver, Nelson Mandela, Network effects, new economy, Nicholas Carr, nonsequential writing, Norbert Wiener, Norman Mailer, Occupy movement, packet switching, PageRank, Panopticon Jeremy Bentham, Paul Graham, peer-to-peer, peer-to-peer rental, Peter Thiel, plutocrats, Plutocrats, Potemkin village, precariat, pre–internet, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Robert Metcalfe, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Skype, smart cities, Snapchat, social web, South of Market, San Francisco, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, TaskRabbit, Ted Nelson, telemarketer, The Future of Employment, the medium is the message, 
the new new thing, Thomas L Friedman, Travis Kalanick, Tyler Cowen: Great Stagnation, Uber for X, uber lyft, urban planning, Vannevar Bush, Whole Earth Catalog, WikiLeaks, winner-take-all economy, working poor, Y Combinator
In particular, I’d like to thank Kurt Andersen, John Borthwick, Stewart Brand, Po Bronson, Erik Brynjolfsson, Nicholas Carr, Clayton Christensen, Ron Conway, Tyler Cowen, Kenneth Cukier, Larry Downes, Tim Draper, Esther Dyson, George Dyson, Walter Isaacson, Tim Ferriss, Michael Fertik, Ze Frank, David Frigstad, James Gleick, Seth Godin, Peter Hirshberg, Reid Hoffman, Ryan Holiday, Brad Horowitz, Jeff Jarvis, Kevin Kelly, David Kirkpatrick, Ray Kurzweil, Jaron Lanier, Robert Levine, Steven Levy, Viktor Mayer-Schönberger, Andrew McAfee, Gavin Newsom, George Packer, Eli Pariser, Andrew Rasiej, Douglas Rushkoff, Chris Schroeder, Tiffany Shlain, Robert Scoble, Dov Seidman, Gary Shapiro, Clay Shirky, Micah Sifry, Martin Sorrell, Tom Standage, Bruce Sterling, Brad Stone, Clive Thompson, Sherry Turkle, Fred Turner, Yossi Vardi, Hans Vestberg, Vivek Wadhwa, and Steve Wozniak for appearing on Keen On . . . and sharing their valuable ideas with me. NOTES Preface 1 The Cult of the Amateur: How Today’s Internet Is Killing Our Culture (New York: Currency/Doubleday, 2007), and Digital Vertigo: How Today’s Online Social Revolution Is Dividing, Diminishing, and Disorienting Us (New York: St. Martin, 2012). Introduction 1 Carolyne Zinko, “New Private S.F. Club the Battery,” SFGate, October 4, 2013. 2 Renée Frojo, “High-Society Tech Club Reborn in San Francisco,” San Francisco Business Times, April 5, 2013. 3 The Battery describes itself on its website: “Indeed, here is where they came to refill their cups.
Google is around seven times larger than GM, but employs less than a quarter of the number of workers. This new data factory economy changes everything—even the monetary supply of our global financial system. In early 2014, the global economy’s top five companies’ gross cash holdings—those of Apple, Google, Microsoft, as well as the US telecom giant Verizon and the Korean electronics conglomerate Samsung—came to $387 billion, the equivalent of the 2013 GDP of the United Arab Emirates.78 This capital imbalance puts the fate of the world economy in the hands of the few cash hoarders like Apple and Google, whose profits are mostly kept offshore to avoid paying US tax. “Apple, Google and Facebook are latter-day scrooges,” worries the Financial Times columnist John Plender about a corporate miserliness that is undermining the growth of the world economy.79 “So what does it all mean?”
See also “The Square People, Part Two,” New York Times, May 17, 2014. 2 Edward Luce, “America Must Dump Its Disrupters in 2014,” Financial Times, December 22, 2014. 3 Nick Cohen, “Beware the Lure of Mark Zuckerberg’s Cool Capitalism,” Observer, March 30, 2013. 4 Fred Turner, From Counterculture to Cyberculture (University of Chicago Press, 2008). 5 For more on Apple, Steve Jobs, and Foxconn, see my TechCrunchTV interview with Mike Daisey, who starred in the Broadway hit The Agony and the Ecstasy of Steve Jobs: “Apple and Foxconn,” TechCrunchTV, February 1, 2011. 6 Lynn Stuart Parramore, “What Does Apple Really Owe Taxpayers? A Lot, Actually,” Reuters, June 18, 2013. 7 Jo Confino, “How Technology Has Stopped Evolution and Is Destroying the World,” Guardian, July 11, 2013. 8 Alexis C. Madrigal, “Camp Grounded, ‘Digital Detox,’ and the Age of Techno-Anxiety,” Atlantic, July 2013. 9 Oliver Burkeman, “Conscious Computing: How to Take Control of Our Life Online,” Guardian, May 10, 2013. 10 Jemima Kiss, “An Online Magna Carta: Berners-Lee Calls for Bill of Rights for Web,” Guardian, March 11, 2014. 11 “Bitcloud Developers Plan to Decentralize Internet,” BBC Technology News, January 23, 2014. 12 Suzanne Labarre, “Why We’re Shutting Off Our Comments,” Popular Science, September 24, 2013; Elizabeth Landers, “Huffington Post to Ban Anonymous Comments,” CNN, August 22, 2013. 13 “Data Protection: Angela Merkel Proposes Europe Network,” BBC News, February 15, 2014. 14 Philip Oltermann, “Germany ‘May Revert to Typewriters’ to Counter Hi-Tech Espionage,” Guardian, July 15, 2014. 15 Lanier, Who Owns the Future?
Cult of the Dead Cow: How the Original Hacking Supergroup Might Just Save the World by Joseph Menn
4chan, A Declaration of the Independence of Cyberspace, Apple II, autonomous vehicles, Berlin Wall, Bernie Sanders, bitcoin, Chelsea Manning, commoditize, corporate governance, Donald Trump, dumpster diving, Edward Snowden, Firefox, Google Chrome, Haight Ashbury, Internet of things, Jacob Appelbaum, Jason Scott: textfiles.com, John Markoff, Julian Assange, Mark Zuckerberg, Mitch Kapor, Naomi Klein, Peter Thiel, pirate software, pre–internet, Ralph Nader, ransomware, Richard Stallman, Robert Mercer, self-driving car, side project, Silicon Valley, Skype, slashdot, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Stuxnet, Whole Earth Catalog, WikiLeaks, zero day
Rosenbaum had spent serious time with the phone phreakers, the forerunners of today’s hackers, and he explained what they were doing in plain English. The phreakers were a diverse group, including John Draper, who called himself Cap’n Crunch after learning that whistles given out with that breakfast cereal could be blown to produce a 2600-hertz tone, which allowed free calls. The technical puzzles of phreaking would attract future innovators up to and including Apple founders Steve Jobs and Steve Wozniak, who sold blue boxes to make free calls while in college. The political divide in America at the end of the 1960s was the worst until the 2000s, and that helped push phreaking in a radical direction. The phone companies were very clearly part of the establishment, and AT&T was a monopoly to boot. That made it a perfect target for the antiwar left and anyone who thought stealing from some companies was more ethical than stealing from others.
On defense, Christien Rioux and Wysopal started Veracode, which analyzed programs for flaws using an automated system dreamed up by Christien in order to make his regular work easier. After Microsoft, Window Snyder went to Apple. Apple’s software had fewer holes than Microsoft’s, but its customers were more valuable, since they tended to have more money. Snyder looked at the criminal ecosystem for chokepoints where she could make fraud more difficult. One of her innovations was to require a developer certificate, which cost $100, to install anything on an iPhone. It wasn’t a lot of money, but it was enough of a speed bump that it became economically unviable for criminals to ship malware in the same way. Going deeper, Snyder argued that criminals would target Apple users less if the company held less data about them. But more data also made for a seamless user experience, a dominant theme at Apple, and executives kept pressing Snyder for evidence that consumers cared.
“It was made easier when people started freaking out about Snowden,” Snyder said. “When people really understand it, they care.” In large part due to Snyder, Apple implemented new techniques that rendered iPhones impenetrable to police and to Apple itself, to the great frustration of the FBI. It was the first major technology company to declare that it had to consider itself a potential adversary to its customers, a real breakthrough in threat modeling. Still later, Snyder landed in a senior security job at top chipmaker Intel. David Litchfield feuded publicly with Oracle over the database giant’s inflated claims of security. He went on to increasingly senior security jobs at Google and Apple. @stake’s Katie Moussouris, a friend to cDc, stayed on at new owner Symantec and then moved to Microsoft, where she got the company to join other software providers in paying bounties to hackers who found and responsibly reported significant flaws.
To Be a Machine: Adventures Among Cyborgs, Utopians, Hackers, and the Futurists Solving the Modest Problem of Death by Mark O'Connell
3D printing, Ada Lovelace, AI winter, Airbnb, Albert Einstein, artificial general intelligence, brain emulation, clean water, cognitive dissonance, computer age, cosmological principle, dark matter, disruptive innovation, double helix, Edward Snowden, effective altruism, Elon Musk, Extropian, friendly AI, global pandemic, impulse control, income inequality, invention of the wheel, Jacques de Vaucanson, John von Neumann, knowledge economy, Law of Accelerating Returns, life extension, lifelogging, Lyft, Mars Rover, means of production, Norbert Wiener, Peter Thiel, profit motive, Ray Kurzweil, RFID, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Singularitarianism, Skype, Stephen Hawking, Steve Wozniak, superintelligent machines, technological singularity, technoutopianism, The Coming Technological Singularity, Travis Kalanick, trickle-down economics, Turing machine, uber lyft, Vernor Vinge
It behooves us to give them every advantage and to bow out when we can no longer contribute.” There is, obviously, something about the idea of intelligent robots that frightens and titillates us, that fuels our feverish visions of omnipotence and obsolescence. The technological imagination projects a fantasy of godhood, with its attendant Promethean anxieties, onto the figure of the automaton. A few days after I returned from Pomona, I read that Steve Wozniak, the cofounder of Apple, had spoken at a conference about his conviction that humans were destined to become the pets of superintelligent robots. But this, he stressed, would not necessarily be an especially undesirable outcome. “It’s actually going to turn out really good for humans,” he said. Robots “will be so smart by then that they’ll know they have to keep nature, and humans are part of nature.” The robots, he believed, would treat us with respect and kindness, with a patrician generosity, because we humans were “the gods originally.”
When I thought of DARPA, I thought, among other things, of its administration of the so-called Information Awareness Office, exposed by the former CIA employee Edward Snowden as a mass surveillance operation organized around a database for the collection and storage of the personal information (emails, telephone records, social networking messages, credit card and banking transactions) of every single resident of the United States, along with those of many other countries, all of which was accomplished by tapping into the user data of marquee-name tech companies like Facebook, Apple, Microsoft, Skype, Google—the corporate proprietors of the sum of things that might be factually and usefully said about you, your information. “Look at him go!” said Prabhakar now, as the robot rounded the second safety barrier, bringing the car over a line in the sand, and gently to a halt in front of the door through which the industrial disaster zone stage set was to be accessed by means of knob-turning.
With me facing the prospect of death and all that? But yeah, I’m not sure it really works.” I glanced down at the book in front of us on the table, and did not disagree. We were sitting in a courtyard at the rear of the café, in blinding midday sunlight. The tables were all occupied, I noted, and yet we were the only people conducting a conversation. Every other customer in the café was alone, and typing on an Apple laptop. As so often in San Francisco, I had a sense of being embedded in some hyperreal simulation of a corporate utopia—or, rather, a heavy-handed parody of such a thing. As a scene, it felt a little overbearing in its symbolism. This is one of the problems with reality: the extent to which it resembles bad fiction. “I’ve seen worse book covers,” I said, which for all I knew might have been the truth.
Our Final Invention: Artificial Intelligence and the End of the Human Era by James Barrat
AI winter, AltaVista, Amazon Web Services, artificial general intelligence, Asilomar, Automated Insights, Bayesian statistics, Bernie Madoff, Bill Joy: nanobots, brain emulation, cellular automata, Chuck Templeton: OpenTable:, cloud computing, cognitive bias, commoditize, computer vision, cuban missile crisis, Daniel Kahneman / Amos Tversky, Danny Hillis, data acquisition, don't be evil, drone strike, Extropian, finite state, Flash crash, friendly AI, friendly fire, Google Glasses, Google X / Alphabet X, Isaac Newton, Jaron Lanier, John Markoff, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, Loebner Prize, lone genius, mutually assured destruction, natural language processing, Nicholas Carr, optical character recognition, PageRank, pattern recognition, Peter Thiel, prisoner's dilemma, Ray Kurzweil, Rodney Brooks, Search for Extraterrestrial Intelligence, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, smart grid, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, superintelligent machines, technological singularity, The Coming Technological Singularity, Thomas Bayes, traveling salesman, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, zero day
Puzzles so difficult that we can’t help but make mistakes, like playing Jeopardy! and deriving Newton's second law of motion, fall in seconds to well-programmed AI. At the same time, no computer vision system can tell the difference between a dog and a cat—something most two-year-old humans can do. To some degree these are apples-and-oranges problems, high-level cognition versus low-level sensor motor skill. But it should be a source of humility for AGI builders, since they aspire to master the whole spectrum of human intelligence. Apple cofounder Steve Wozniak has proposed an “easy” alternative to the Turing test that shows the complexity of simple tasks. We should deem any robot intelligent, Wozniak says, when it can walk into any home, find the coffeemaker and supplies, and make us a cup of coffee. You could call it the Mr. Coffee Test.
So, Kurzweil’s more general law, the Law of Accelerating Returns, is a better fit. And more technologies are becoming information technologies, as computers, and even robots, grow ever more intimately involved with every aspect of product design, manufacture, and sales. Consider that every smart phone’s manufacture—not just its processor chip—took advantage of the digital revolution. It’s been just six years since Apple’s iPhone first came out, and Apple has released six versions. Apple has more than doubled its speed and for most users halved its price, or better. That’s because hardware speed has been regularly doubling in the components within the end product. But it’s also been doubling in every link in the production pipeline that led to its creation. The effects anticipated by LOAR reach far beyond the computer and smart phone businesses.
An alternative is to install the intelligent agent in a robot, to continue its education and fulfill its programmed goals in the real world. Another is to use the agent AI to augment a human brain. Broadly speaking, those who believe intelligence must be embodied hold that knowledge itself is grounded in sensory and motor experiences. Cognitive processing cannot take place without it. Learning facts about apples, they claim, will never make you intelligent, in a human sense, about an apple. You’ll never develop a “concept” of an apple from reading or hearing about one—concept forming requires that you smell, feel, see, and taste—the more the better. In AI this is known as the “grounding problem.” Consider some systems whose powerful cognitive abilities lie somewhere beyond narrow AI but fall short of AGI. Recently, Hod Lipson at Cornell University’s Computational Synthesis Lab developed software that derives scientific laws from raw data.
Live Work Work Work Die: A Journey Into the Savage Heart of Silicon Valley by Corey Pein
23andMe, 4chan, affirmative action, Affordable Care Act / Obamacare, Airbnb, Amazon Mechanical Turk, Anne Wojcicki, artificial general intelligence, bank run, barriers to entry, Benevolent Dictator For Life (BDFL), Bernie Sanders, bitcoin, Build a better mousetrap, California gold rush, cashless society, colonial rule, computer age, cryptocurrency, data is the new oil, disruptive innovation, Donald Trump, Douglas Hofstadter, Elon Musk, Extropian, gig economy, Google bus, Google Glasses, Google X / Alphabet X, hacker house, hive mind, illegal immigration, immigration reform, Internet of things, invisible hand, Isaac Newton, Jeff Bezos, job automation, Kevin Kelly, Khan Academy, Law of Accelerating Returns, Lean Startup, life extension, Lyft, Mahatma Gandhi, Marc Andreessen, Mark Zuckerberg, Menlo Park, minimum viable product, move fast and break things, mutually assured destruction, obamacare, passive income, patent troll, Paul Graham, peer-to-peer lending, Peter H. Diamandis: Planetary Resources, Peter Thiel, platform as a service, plutocrats, Plutocrats, Ponzi scheme, post-work, Ray Kurzweil, regulatory arbitrage, rent control, RFID, Robert Mercer, rolodex, Ronald Reagan, Ross Ulbricht, Ruby on Rails, Sam Altman, Sand Hill Road, Scientific racism, self-driving car, sharing economy, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, Skype, Snapchat, social software, software as a service, source of truth, South of Market, San Francisco, Startup school, stealth mode startup, Steve Jobs, Steve Wozniak, TaskRabbit, technological singularity, technoutopianism, telepresence, too big to fail, Travis Kalanick, tulip mania, Uber for X, uber lyft, ubercab, upwardly mobile, Vernor Vinge, X Prize, Y Combinator
Francis, the other guest, who was currently staying in the indoor bedroom, was an Englishman from Portsmouth, a town I knew. He was about to move to London to take a job with a startup that projected Web streams on walls at tech conferences. It sounded stupid, but I congratulated him all the same. This was his dream vacation in America—his “techie pilgrimage” around Silicon Valley. So far Francis had visited Steve Jobs’s old house; the garage where Apple cofounder Steve Wozniak built the first Apple computer; the Xerox PARC laboratory, where many modern features of consumer computers, such as the graphical user interface, had been invented with government support; the Hewlett-Packard campus; and the Googleplex, which was a stone’s throw from Jeannie’s place. Marveling at the many golden-hued wonders over every desiccated California hilltop, Francis saw signs of genius everywhere he looked.
And a majority lacked trust in organized religion, the medical system, the presidency, the Supreme Court, public schools, and newspapers. Four in five distrusted the economic institutions of organized labor, banks, and big business. However, another poll, also taken in 2012, found that 82 percent of Americans had a favorable opinion of Google. Two-thirds felt favorably toward Apple. Nearly three in five approved of Facebook. Zuckerberg was bigger than Jesus. In 2015, Gallup reran its poll and found that confidence in government and most other institutions had fallen even lower, while another poll of consumer brands showed that Google, Apple, and Facebook had maintained the same high levels of public admiration. Americans hate the government, and they don’t much like big corporations, either, but years of propaganda have convinced them that the tech companies are somehow different, that Silicon Valley nobility are uniquely enlightened, benevolent, and cool—not your average jerk billionaires!
Please use the search function on your e-reading device to search for terms of interest. For your reference, the terms that appear in the print index are listed below. Adolphe, Eric Ad:Tech Aeiveos Airbnb Alcor Alexander, Steve Aloise, Rome Alphabet Altman, Sam Amazon American Conservative Ancestry.com Anderson, Kyle Andreessen, Marc Andreessen-Horowitz Anduril Angry Birds Anissimov, Michael Apple Apple iTunes Architectural Digest ARPANET Ask.fm Associated Press Auerbach, David Auletta, Ken Bank of America Bannon, Steve Benthall, Blake Bezos, Jeff Bharara, Preet BIL Conference BioCurious Biogen Bitcoin BitTorrent Blogger Bloomberg News Blum, Richard C. “Dick” BMW Boeing Booz Allen Hamilton Borges, Jorge Luis Bradbury, Robert J. Breitbart News Breivik, Anders Brin, Sergey Burner Burnham, John Bush, George W.
The Slow Fix: Solve Problems, Work Smarter, and Live Better in a World Addicted to Speed by Carl Honore
Albert Einstein, Atul Gawande, Broken windows theory, call centre, Checklist Manifesto, clean water, clockwatching, cloud computing, crowdsourcing, Dava Sobel, delayed gratification, drone strike, Enrique Peñalosa, Erik Brynjolfsson, Ernest Rutherford, Exxon Valdez, fundamental attribution error, game design, income inequality, index card, invention of the printing press, invisible hand, Isaac Newton, Jeff Bezos, John Harrison: Longitude, lateral thinking, lone genius, medical malpractice, microcredit, Netflix Prize, planetary scale, Ralph Waldo Emerson, RAND corporation, shareholder value, Silicon Valley, Skype, stem cell, Steve Jobs, Steve Wozniak, the scientific method, The Wisdom of Crowds, ultimatum game, urban renewal, War on Poverty
William Wordsworth described Newton as “A mind for ever/Voyaging through strange seas of Thought, alone.” Every major religion has prophets – Buddha, Muhammad, Moses – who went out into the wilderness to grapple with the big questions on their own. Picasso once said, “Without great solitude, no serious work is possible.” This remains true in the modern, high-tech world. In his memoir, Steve Wozniak described how he built the first two Apple computers working by himself late into the night: “Most inventors and engineers I’ve met are like me … they live in their heads. They’re almost like artists … And artists work best alone.” That is why we all need to be careful when tapping the crowd. Ask yourself if your problem really will benefit from being thrown open to everybody. If it can, take the time to figure out exactly the right question to ask and how to manage and reward the crowd.
“The vision and passion of a core creator is essential,” he says. Though Apple relies on collaboration and teamwork to forge its game-changing gadgets, it also encourages project leaders to act as “auteurs,” who lead from the front and stamp their personality all over the final product. Jonathan Ive was so central to designing the iMac, iPod and iPad that he is sometimes credited with inventing the devices. And then there was the auteur-in-chief, Steve Jobs. Friends and foes likened his knack for winning over people to a “reality distortion field.” His keynote speeches were hailed as master-classes in the art of persuasion. By the time he died in 2011, Jobs had achieved the kind of rock star status seldom granted to CEOs, with fans leaving flowers, messages and even apples with a bite taken out of them at Apple stores around the world. When it came to forging a Slow Fix for Bogotá, the city leaned heavily on two visionary mayors who served back-to-back terms starting in 1995.
Flaubert wrote 52 versions of it before finally hitting on the perfect arrangement of words. Like McNutt’s, his motto was: “The good God is in the detail.” Steve Jobs, founder and former CEO of Apple, took that creed to the level of obsessive compulsion. Towards the end of his life, as he lay dying in hospital, he burned through 67 nurses before settling on three that met his exacting standards. Even when heavily sedated, he tore an oxygen mask off his face to object to the way it looked. The pulmonologist was startled when Jobs demanded to see five other mask designs so he could pick the one he liked best. Yet what sounds like a rampant case of OCD helped turn Apple into one of the most successful companies in history. Deadlines came and went as Jobs drove his designers, engineers and marketers to get every detail just right.
Doing Good Better: How Effective Altruism Can Help You Make a Difference by William MacAskill
barriers to entry, basic income, Black Swan, Branko Milanovic, Cal Newport, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, clean water, corporate social responsibility, correlation does not imply causation, Daniel Kahneman / Amos Tversky, David Brooks, effective altruism, en.wikipedia.org, end world poverty, experimental subject, follow your passion, food miles, immigration reform, income inequality, index fund, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, job automation, job satisfaction, Lean Startup, M-Pesa, mass immigration, meta-analysis, microcredit, Nate Silver, Peter Singer: altruism, purchasing power parity, quantitative trading / quantitative finance, randomized controlled trial, self-driving car, Skype, Stanislav Petrov, Steve Jobs, Steve Wozniak, Steven Pinker, The Future of Employment, The Wealth of Nations by Adam Smith, universal basic income, women in the workforce
He traveled in India, took plenty of LSD, shaved his head, wore robes, and seriously considered moving to Japan to become a monk. He first got into electronics only reluctantly, as a way to earn cash on the side, helping his tech-savvy friend Steve Wozniak handle business deals while also spending time at the All-One Farm. Even Apple Computer’s very existence was fortuitous: while Jobs and Wozniak were trying to sell circuit boards to hobbyists, the owner of one local computer store said he would buy fully assembled computers, and they jumped at the chance to make more money. It was only once they started to gain traction and success that Jobs’s passion for Apple and computing really bloomed. What about following your heart, your gut, or your itch to find work you love? The evidence suggests that won’t work, either, since we’re bad at predicting what will make us happy.
For health benefits, economists: In this discussion I’ve left out the most common metric used by economists to measure harms and benefits, which is called willingness to pay. According to this metric, the size of a benefit of something to a person is measured by how much that person is willing to pay for it. If Jones is willing to pay a dollar for an apple, but Smith is willing to pay ten dollars for an apple, then, if we used this metric, we would conclude that giving Smith an apple provides ten times as great a benefit as providing Jones with an apple. The reason I don’t rely on this metric is that it treats an additional dollar as being of equal worth no matter who has it. But this is clearly wrong. If Smith is a multimillionaire, whereas Jones is poor, then one dollar will be much less valuable to Smith than a dollar is to Jones. This problem becomes particularly severe if we try to compare activities that benefit people in rich countries with activities that benefit people in poor countries.
Suppose you’re deciding whether to buy a Mac or a PC. What factors would you consider? You’d probably think about the design and usability of the two computers, the hardware, the software, and the price. You certainly wouldn’t think about how much Apple and Microsoft each spend on administration, and you wouldn’t think about how much their respective CEOs are paid. Why would you? As a consumer you only care about the product you get with the money you spend; details about the financials of the companies who make the products are almost always irrelevant. If Apple spent a lot of money to attract a more talented management team, you might even consider that a good sign that their products were the best on the market! If we don’t care about financial information when we buy products for ourselves, why should we care about financial information when we buy products for other people?
The Happiness Curve: Why Life Gets Better After 50 by Jonathan Rauch
endowment effect, experimental subject, Google bus, happiness index / gross national happiness, hedonic treadmill, income per capita, job satisfaction, longitudinal study, loss aversion, Richard Thaler, science of happiness, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, upwardly mobile, World Values Survey, zero-sum game
It’s possibly the most important question in all of social science.” As Joshua Wolf Shenk has observed in his book Powers of Two: How Relationships Drive Creativity, a lot of the greatest creativity is the result of creative dyads, partnerships in which two very different people complement each other and become a sort of super-thinker or super-creator. John Lennon and Paul McCartney, of the Beatles, are an iconic example; or Steve Jobs and Steve Wozniak, the founders of Apple Computer; or Thomas Jefferson and James Madison, whose political partnership married the ideas of individual freedom and constitutional order. Though each member of the dyad was exceptionally talented, their combination sparked a chemical reaction that created something fantastic. Around the time he discovered big data sets on happiness, Oswald met his own creativity reagent: David Blanchflower.
See also life satisfaction by age assumptions about equations expectation gap and Graham and Nikolova data on sensitization factor significance of effect unemployment factor compared to ages of man aging depression and anxiety survey on geriatric psychiatry field of successful Golden Years model for outdated model of paradox of physical limits and as social concept social selectivity and stress decline after age fifty survey on successful time and United States and Japan study on wisdom and wisdom and American Association of Retired Persons (AARP) American Beauty The American Interest American Journal of Psychiatry American Psychiatric Association American Psychologist antidepressants anxiety apes, U-curve pattern in Apple Computer Ardelt, Monika Aristotle The Atlantic Authentic Happiness: Using the New Positive Psychology to Realize Your Potential for Lasting Fulfillment (Seligman) average life satisfaction by age adjusted world sample, 2010–2012 unadjusted world sample, 2010–2012 awards baby boomers Bangen, Katherine Barbary macaques Bartolini, Stefano Be Here Now (Ram Dass) Berlin, Isaiah Bhagavad Gita bias.
The Perfect Thing: How the iPod Shuffles Commerce, Culture, and Coolness by Steven Levy
Apple II, British Empire, Claude Shannon: information theory, en.wikipedia.org, indoor plumbing, Internet Archive, Jeff Bezos, John Markoff, Joi Ito, Jony Ive, Kevin Kelly, Sand Hill Road, Saturday Night Live, Silicon Valley, social web, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, technology bubble, Thomas L Friedman
Two things always seem to evoke an indignant outburst of "It ain't natural!" One is drugs; the other is technology, applied so as to please ourselves. When the latter is used to get effects as mind-blowing as the former, things become really interesting. (One of the most memorable quotes I've ever gathered in my reporting career came in 1982, covering the US Festival, a huge rock concert sponsored by Apple cofounder Steve Wozniak. At a motel nearby, Jerry Garcia, who was prepping to play a "Breakfast with the Grateful Dead" set, proclaimed, "Technology is the new drugs." Okay, not an original concept, but consider the source.) Without altering one's chemical composition, the iPod does change your head. Plugging directly into your ears, dominating the brain matter in between, and shuffling your music collection to extract constant delight, it generates a portable alternative reality, almost always more pleasant than the real one.
Years earlier, during his first tenure at Apple, when he was working on the Macintosh, I had asked him what he wanted for Apple. "I want us," he said, "to be a ten-billion-dollar company that doesn't lose its soul." Now Apple's revenues had fallen from a high of eight billion dollars to barely five billion. And Jobs would be the first to tell you that the soul level had fallen from Solomon Burke heights to the Lawrence Welk abyss. "I think the world is a slightly better place with Apple Computer," he told me then. "If Apple could return to its roots as an innovator, then the whole industry would benefit from that. When you really look at it, there are two things about Apple that are remarkable. One, Apple owns one of the two high-volume operating systems in the world. Second, Apple is the only company left that makes the whole widget.
Soon before the launch, the first production iPods arrived, ready for the lucky first wave (like me) who would receive them in advance of the thousands that would be snapped up instantly when Apple began selling them to the public in November. Looking back on the process, Jobs waxes philosophical. "If there was ever a product that catalyzed what's Apple's reason for being, it's this," he says. "Because it combines Apple's incredible technology base with Apple's legendary ease of use with Apple's awesome design. Those three things come together in this, and it's like, that's what we do. So if anybody was ever wondering why is Apple on the earth, I would hold up this as a good example." A few days after the launch, Jobs threw a celebratory lunch for forty or so of the core people who had worked on the product. He thanked the team not only for making a great product but for taking all of Apple in a new, limitless direction. For the meal, he sat down at a table with a few unfamiliar faces and asked those he didn't recognize who they were and what they did.
The Entrepreneurial State: Debunking Public vs. Private Sector Myths by Mariana Mazzucato
"Robert Solow", Apple II, banking crisis, barriers to entry, Bretton Woods, business cycle, California gold rush, call centre, carbon footprint, Carmen Reinhart, cleantech, computer age, creative destruction, credit crunch, David Ricardo: comparative advantage, demand response, deskilling, endogenous growth, energy security, energy transition, eurozone crisis, everywhere but in the productivity statistics, Financial Instability Hypothesis, full employment, G4S, Growth in a Time of Debt, Hyman Minsky, incomplete markets, information retrieval, intangible asset, invisible hand, Joseph Schumpeter, Kenneth Rogoff, Kickstarter, knowledge economy, knowledge worker, natural language processing, new economy, offshore financial centre, Philip Mirowski, popular electronics, profit maximization, Ralph Nader, renewable energy credits, rent-seeking, ride hailing / ride sharing, risk tolerance, shareholder value, Silicon Valley, Silicon Valley ideology, smart grid, Steve Jobs, Steve Wozniak, The Wealth of Nations by Adam Smith, Tim Cook: Apple, too big to fail, total factor productivity, trickle-down economics, Washington Consensus, William Shockley: the traitorous eight
While the products owe their beautiful design and slick integration to the genius of Jobs and his large team, nearly every state-of-the-art technology found in the iPod, iPhone and iPad is an often overlooked achievement of the research efforts and funding support of the government and military. Only about a decade ago Apple was best known for its innovative personal computer design and production. Established on 1 April 1976 in Cupertino, California by Steve Jobs, Steve Wozniak and Ronald Wayne, Apple was incorporated in 1977 by Jobs and Wozniak to sell the Apple I personal computer.1 The company was originally named Apple Computer, Inc. and for 30 years focused on the production of personal computers. On 9 January 2007, the company announced it was removing the ‘Computer’ from its name, reflecting its shift in focus from personal computers to consumer electronics. This same year, Apple launched the iPhone and iPod Touch featuring its new mobile operating system, iOS, which is now used in other Apple products such as the iPad and Apple TV. Drawing on many of the technological capabilities of earlier generations of the iPod, the iPhone (and iPod Touch) featured a revolutionary multi-touch screen with a virtual keyboard as part of its new operating system.
Table 3. Apple’s net sales, income and R&D figures between 1999 and 2011 (US$, millions). Note: Apple’s annual net sales, income and R&D figures were obtained from the company’s annual SEC 10-K filings.
Figure 10. Apple net sales by region and product (US$, billions).
While Apple achieved notable success during its 30-year history by focusing on personal computers, the success and popularity of its new iOS products have far exceeded any of its former achievements in personal computing.2 In the 5-year period following the launch of the iPhone and iPod Touch in 2007, Apple’s global net sales increased nearly 460 per cent. As Table 3 illustrates, the new iOS product line represented nearly 70 per cent of the overall net sales of Apple in 2011. The success and popularity of Apple’s new products were quickly reflected in the company’s revenues.
If one takes this figure and then adds the estimated 210,000 jobs that are focused on developing mobile applications for the Apple Store, the aggregate total is estimated at 514,000 jobs that are either created or enabled/supported by Apple (Apple 2012). Apple bases its claims on a report developed by the Analysis Group, a private consulting firm Apple hired to study its impact in the job market.1 The attention to these numbers stems largely from the ongoing debate regarding whether or not technology companies have been contributing to overall job creation within the domestic manufacturing sector. Of the 304,000 total jobs the company claims, Apple directly employs people in 47,000; over 27,000 of those are within the 246 Apple Stores located in 44 US states. The company does not reveal exactly what portion of the 304,000 figure includes manufacturing jobs specifically (or those jobs created by overseas manufacturers such as Foxconn).
Late Bloomers: The Power of Patience in a World Obsessed With Early Achievement by Rich Karlgaard
Airbnb, Albert Einstein, Amazon Web Services, Apple's 1984 Super Bowl advert, Bernie Madoff, Bob Noyce, Brownian motion, Captain Sullenberger Hudson, cloud computing, cognitive dissonance, Daniel Kahneman / Amos Tversky, deliberate practice, Electric Kool-Aid Acid Test, Elon Musk, en.wikipedia.org, experimental economics, fear of failure, financial independence, follow your passion, Frederick Winslow Taylor, hiring and firing, Internet of things, Isaac Newton, Jeff Bezos, job satisfaction, knowledge economy, labor-force participation, longitudinal study, low skilled workers, Mark Zuckerberg, meta-analysis, Moneyball by Michael Lewis explains big data, move fast and break things, pattern recognition, Peter Thiel, Sand Hill Road, science of happiness, shareholder value, Silicon Valley, Silicon Valley startup, Snapchat, Steve Jobs, Steve Wozniak, theory of mind, Tim Cook: Apple, Toyota Production System, unpaid internship, upwardly mobile, women in the workforce, working poor
Early on, he conducted job interviews himself; and he too asked candidates for their SAT scores. Bezos said: ‘Every time we hire someone, he or she should raise the bar for the next hire, so that the overall talent pool is always improving.’ ” Bezos jokingly told a reporter that his wife’s high SAT scores made the pair compatible. Facebook founder Mark Zuckerberg scored a perfect 1600 on the math and English parts of his SAT. And Apple cofounder Steve Wozniak? He, too, scored an 800 on his math SAT. Two more eye-opening numbers: The combined personal wealth of our six wunderkind math SAT takers is more than $300 billion. And the companies they created are worth $3.6 trillion, more than the GDP of all but nine countries. Surely this is worthy of applause. But let’s stop and consider the price. One well-known contrarian thinker argues that today’s obsession with SAT scores and the wealth-generating mojo of algorithmic genius has left the U.S. economy in a decades-long state of underperformance and a society desperate for answers.
Exceptionally gifted, Holmes was also exceptionally driven. Her business hero was Steve Jobs, the prodigy who cofounded Apple and later led it to glory. She quickly adopted Jobs’s tropes and mannerisms. She wore black turtlenecks. She steepled her hands. Her slow-blinking stare, it was said, could bore holes through your eye sockets. She also took on Jobs’s less admirable traits. She ran Theranos like a police state, obsessed with preventing employees from talking about their work with each other. She became a master at employing Jobs’s “reality distortion field”—a fictional narrative about her own genius and wondrous Theranos products that was unyielding to the facts. Jobs was only twenty-one when he cofounded Apple, and twenty-five when Apple first sold its stock to the public, making Jobs a young celebrity and centi-millionaire.
Once again, we retain that capability for much longer than previously thought. In 2008, Hector Zenil, who coleads the Algorithmic Dynamics Lab at the Karolinska Institute in Sweden, studied 3,400 people between the ages of four and ninety-one on their ability to behave randomly. The idea is that random thinking—seeing beyond the obvious—is connected to creative thinking. When an apple falls from a tree, the creative person doesn’t simply think that apple must have been ripe; like Isaac Newton, she sees the apple fall and pictures the invisible force of gravity. How did Zenil and his fellow researchers test for random thinking? They developed five short “random item generation” tasks performed on a computer, including twelve simulated coin flips, ten simulated rolls of a die, and arranging boxes on a grid. The test taker’s job was to make their sequence of answers appear as unpredictable as possible to a logical computer program.
Running Money by Andy Kessler
Andy Kessler, Apple II, bioinformatics, Bob Noyce, British Empire, business intelligence, buy and hold, buy low sell high, call centre, Corn Laws, Douglas Engelbart, family office, full employment, George Gilder, happiness index / gross national happiness, interest rate swap, invisible hand, James Hargreaves, James Watt: steam engine, joint-stock company, joint-stock limited liability company, knowledge worker, Leonard Kleinrock, Long Term Capital Management, mail merge, Marc Andreessen, margin call, market bubble, Maui Hawaii, Menlo Park, Metcalfe’s law, Mitch Kapor, Network effects, packet switching, pattern recognition, pets.com, railway mania, risk tolerance, Robert Metcalfe, Sand Hill Road, Silicon Valley, South China Sea, spinning jenny, Steve Jobs, Steve Wozniak, Toyota Production System, zero-sum game
He quickly pulled up his sleeve and pointed to a Microma watch on his wrist and told me he wore it often to remind himself to never be that stupid again. Intel’s lesson: make the intellectual property, not the end product. The cool thing about a computer on a chip is you can start a computer company without knowing much about computers. Steve Jobs and Steve Wozniak created Apple Computer without knowing that much. Wozniak had to write some software to get data on and off a floppy disk drive, which no one else had, and their Apple II became a hit. IBM knew lots about how to milk big bucks out of big computers, but nothing about microprocessors. So a stealth group in Florida contracted out the work, creating a Frankenstein-like IBM PC in 1981, using an Intel microprocessor, Microsoft software and a Western Digital disk controller. Design and manufacture were now separated in the computer business too.
See Advanced Micro Devices American Federation of Information Processing Societies, Fall Joint Computer Conference (1968), 119–20, 123 America Online. See AOL Andreessen, Marc, 197, 199 animation, 134–35 AOL (America Online), 69–73, 207, 208, 223, 290 Cisco routers and, 199 Inktomic cache software and, 143 Netscape Navigator purchase, 201, 225 Telesave deal, 72–73 TimeWarner deal, 223, 229 as top market cap company, 111 Apache Web server, 247 Apple Computer, 45, 127, 128 Apple II, 183 Applied Materials, 245 Archimedes (propeller ship), 94 Arkwright, Richard, 65 ARPANET, 186, 187, 189, 191 Arthur Andersen, 290 Artists and Repertoire (A&R), 212, 216 Asian debt crisis, 3, 150, 151, 229, 260 yen and, 162–65, 168, 292 @ (at sign), 187 AT&T, 61, 185–86, 189 August Capital, 2, 4 auto industry, 267–68 Aziz, Tariq, 26 Babbage, Charles, 93 Baker, James, 26 Balkanski, Alex, 44, 249 bandwidth, 60, 111, 121, 140, 180, 188–89 Baran, Paul, 184, 185 Barbados, 251, 254 300 Index Barksdale, Jim, 198, 199–201 Barksdale Group, 201 BASE, 249 BASIC computer language, 126, 127 BBN.
You remember my suggestion for C-Cube.” “Of course, you told me that a salesman or broker has 30 seconds and three bullet points to pitch our deal. So we need to provide that in our positioning.” “Right. But I figured out that the bullet points are always the same.” “Always?” “Sure. Bullet one is a large market, as Don Valentine says.” Don Valentine ran Sequoia Ventures and first made his mark funding Apple Computer. His golden touch didn’t stop—he funded Cisco and Sierra Semiconductor. He was also chairman of C-Cube and sitting across the table. “Bullet two is an unfair competitive advantage, and bullet three is a business model leveraging that unfair advantage. I just fill in the details company by company.” “It’s ‘monster market,’ ” Don Valentine threw in. Shit, I didn’t realize he was listening.
Everything for Everyone: The Radical Tradition That Is Shaping the Next Economy by Nathan Schneider
1960s counterculture, Affordable Care Act / Obamacare, Airbnb, altcoin, Amazon Mechanical Turk, back-to-the-land, basic income, Berlin Wall, Bernie Sanders, bitcoin, blockchain, Brewster Kahle, Burning Man, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, Clayton Christensen, collaborative economy, collective bargaining, Community Supported Agriculture, corporate governance, creative destruction, crowdsourcing, cryptocurrency, Debian, disruptive innovation, do-ocracy, Donald Knuth, Donald Trump, Edward Snowden, Elon Musk, Ethereum, ethereum blockchain, Food sovereignty, four colour theorem, future of work, gig economy, Google bus, hydraulic fracturing, Internet Archive, Jeff Bezos, jimmy wales, joint-stock company, Joseph Schumpeter, Julian Assange, Kickstarter, Lyft, M-Pesa, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, mass immigration, means of production, multi-sided market, new economy, offshore financial centre, old-boy network, Peter H. Diamandis: Planetary Resources, post-work, precariat, premature optimization, pre–internet, profit motive, race to the bottom, Richard Florida, Richard Stallman, ride hailing / ride sharing, Sam Altman, Satoshi Nakamoto, self-driving car, shareholder value, sharing economy, Silicon Valley, Slavoj Žižek, smart contracts, Steve Jobs, Steve Wozniak, Stewart Brand, transaction costs, Turing test, Uber and Lyft, uber lyft, underbanked, undersea cable, universal basic income, Upton Sinclair, Vanguard fund, white flight, Whole Earth Catalog, WikiLeaks, women in the workforce, working poor, Y Combinator, Y2K, Zipcar
They had their own publication, the People’s Computer Company Newsletter, and a mostly theoretical network (with only one actual node, in a Berkeley record store) called Community Memory. Their propaganda described the computer as a “radical social artifact” that would usher in a “direct democracy of information”—“actively free (‘open’) information,” of course.5 This was the culture out of which arose such icons as Steve Wozniak, inventor of the Apple computer, and the Whole Earth Catalog, which hyped the digital revolution with all that newsprint and mail-order could muster. Like the unMonastery, these guerrilla hackers blended the old with the new, the ancient with the postindustrial. Although their projects often relied on state or corporate subsidies, they envisioned their efforts as apolitical, wrapped in the “safe neutrality” of information, as Roszak put it.
Otherwise, democracy becomes a spectator sport—as real, and yet as out of reach, as reality TV. When tech people talk about “democratizing” something, like driving directions or online banking, what they really mean is access. Access is fine, but it’s just access. It’s a drive-through window, not a door. Access is only part of what democracy has always entailed—alongside real ownership, governance, and accountability. Democracy is a process, not a product. Apple’s Orwell-themed 1984 Super Bowl commercial presented the personal computer as a hammer in the face of Big Brother; later that year, after Election Day, the company printed an ad in Newsweek that proposed “the principle of democracy as it applies to technology”: “One person, one computer.” The best-selling futurist handbook of the same period, John Naisbitt’s Megatrends, likewise promised that “the computer will smash the pyramid,” and with its networks “we can restructure our institutions horizontally.”16 What we’ve gotten instead are apps from online monopolies accountable to their almighty stock tickers.
More than reducing the literal tempo, the slow-food movement has sought values-oriented economies and thicker communities. Similarly, though my computer is for the most part very speedy and able, slow computing means slowing down enough to compute in community. This turns annoyances into pleasures, just as one learns to appreciate unhygienic soil, disorderly farmers’ markets, and inconvenient seasons. The software I use now lacks the veneer of flawlessness that Apple products provide; it is quite clearly a work in progress, forever under construction by programmers who notice a need and share their fixes with everyone. But early on, I found that the glitches felt different than they used to. What would have driven me crazy on a MacBook didn’t upset me anymore. No longer could I curse some abstract corporation somewhere. With community-made software, there’s no one to blame but us.
Everything Is Obvious: *Once You Know the Answer by Duncan J. Watts
active measures, affirmative action, Albert Einstein, Amazon Mechanical Turk, Black Swan, business cycle, butterfly effect, Carmen Reinhart, Cass Sunstein, clockwork universe, cognitive dissonance, coherent worldview, collapse of Lehman Brothers, complexity theory, correlation does not imply causation, crowdsourcing, death of newspapers, discovery of DNA, East Village, easy for humans, difficult for computers, edge city, en.wikipedia.org, Erik Brynjolfsson, framing effect, Geoffrey West, Santa Fe Institute, George Santayana, happiness index / gross national happiness, high batting average, hindsight bias, illegal immigration, industrial cluster, interest rate swap, invention of the printing press, invention of the telescope, invisible hand, Isaac Newton, Jane Jacobs, Jeff Bezos, Joseph Schumpeter, Kenneth Rogoff, lake wobegon effect, Laplace demon, Long Term Capital Management, loss aversion, medical malpractice, meta analysis, meta-analysis, Milgram experiment, natural language processing, Netflix Prize, Network effects, oil shock, packet switching, pattern recognition, performance metric, phenotype, Pierre-Simon Laplace, planetary scale, prediction markets, pre–internet, RAND corporation, random walk, RFID, school choice, Silicon Valley, social intelligence, statistical model, Steve Ballmer, Steve Jobs, Steve Wozniak, supply-chain management, The Death and Life of Great American Cities, the scientific method, The Wisdom of Crowds, too big to fail, Toyota Production System, ultimatum game, urban planning, Vincenzo Peruggia: Mona Lisa, Watson beat the top human players on Jeopardy!, X Prize
To illustrate this problem, let’s step away from bankers for a moment and ask a less-fashionable question: To what extent should Steve Jobs, cofounder and CEO of Apple Inc., be credited with Apple’s recent success? Conventional wisdom holds that he is largely responsible for it, and not without reason. Since Jobs returned in the late 1990s to lead the company that he founded in 1976 with Steve Wozniak in a Silicon Valley garage, its fortunes have undergone a dramatic resurgence, producing a string of hit products like the iMac, the iPod, and the iPhone. As of the end of 2009, Apple had outperformed the overall stock market and its industry peers by about 150 percent over the previous six years, and in May 2010 Apple overtook Microsoft to become the most valuable technology company in the world. During all this time, Jobs has reportedly received neither a salary nor a cash bonus—his entire compensation has been in Apple stock.20 It’s a compelling story, and the list of Apple’s successes is long enough that it’s hard to believe it’s all due to chance.
Surprisingly, the company that “got it right” in the music industry was Apple, with its combination of the iPod player and the iTunes store. In retrospect, Apple’s strategy looks visionary, and analysts and consumers alike fall over themselves to pay homage to Apple’s dedication to design and quality. Yet the iPod was exactly the kind of strategic play that the lessons of Betamax, not to mention Apple’s own experience in the PC market, should have taught them would fail. The iPod was large and expensive. It was based on a closed architecture that Apple refused to license, ran on proprietary software, and was actively resisted by the major content providers. Nevertheless, it was a smashing success. So in what sense was Apple’s strategy better than Sony’s? Yes, Apple had made a great product, but so had Sony. Yes, they looked ahead and did their best to see which way the technological winds were blowing, but so did Sony.
As with all explanations that depend on the known outcome to account for why a particular strategy was good or bad, the conventional wisdom regarding Apple’s recent success is vulnerable to the Halo Effect. Quite aside from the Halo Effect, however, there is another potential problem with the conventional wisdom about Apple. And that is our tendency to attribute the lion’s share of the success of an entire corporation, employing tens of thousands of talented engineers, designers, and managers to one individual. As with all commonsense explanations, the argument that Steve Jobs is the irreplaceable architect of Apple’s success is entirely plausible. Not only did Apple’s turnaround begin with Jobs’s return, after a decade of exile, from 1986 to 1996, but his reputation as a fiercely demanding manager with a relentless focus on innovation, design, and engineering excellence would seem to draw a direct line between his approach to leadership and Apple’s success.
The Pirate's Dilemma by Matt Mason
"side hustle", Albert Einstein, augmented reality, barriers to entry, citizen journalism, creative destruction, don't be evil, Donald Trump, Douglas Engelbart, East Village, Firefox, future of work, glass ceiling, global village, Hacker Ethic, haute couture, Howard Rheingold, Internet of things, invisible hand, Isaac Newton, jimmy wales, job satisfaction, John Markoff, Joseph Schumpeter, Kickstarter, Lao Tzu, Marshall McLuhan, means of production, Naomi Klein, new economy, New Urbanism, patent troll, peer-to-peer, prisoner's dilemma, RAND corporation, RFID, Richard Florida, Richard Stallman, SETI@home, Silicon Valley, South China Sea, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Tim Cook: Apple, urban sprawl, Whole Earth Catalog
They met in the garage of French’s home in San Mateo County, California, to ponder the future of computing, using new technology such as the Altair kit. (In 1971 Moore unwillingly had $20,000 of seed money bestowed on him, which freaked him out so much that he buried it in his backyard.) Here the idea that became the personal computer was formulated. The club’s members included a college dropout who occasionally dropped acid named Steve Jobs, and his future Apple cofounder, Steve Wozniak. They remixed early programs, fixing and debugging them, publishing their findings in a regular newsletter, and recruiting more members along the way. Like disco, computer software was something of a loose-knit, collaborative effort with an open social structure. And like disco, it would change completely once it went commercial. In 1976, a twenty-one-year-old programmer (another college dropout rumored to have dabbled with LSD) wrote the Homebrew Computer Club an angry letter, saying the club couldn’t use his software, a program called BASIC, without paying for it.
In May 2006 he financed a lawsuit that overturned New York legislation that banned anyone age eighteen to twenty from buying or carrying spray paint or broad-tipped marker pens, which effectively prohibited many art students from buying supplies for school, or even carrying them to class. Judge George B. Daniels granted a temporary injunction against the law, arguing, “That’s like telling me I can eat an apple, but I can’t buy an apple, no one can sell me an apple and I can’t bring it to work for lunch.” hoax was spectacular and his ongoing defense of street art and free speech is admirable, but some critics see nothing more than thinly veiled PR stunts for his multinational corporation. The one thing both artists proved for certain is that the media afterlife of a piece of graffiti can be way more powerful than its temporary terrestrial body.
If we also consider that over 800 million mobile phones are sold every year, which will all be able to receive music, and study the growth of new technologies and their application in the last century, the industry should be confident.” The jury is still out on Jenner’s solution, but EMI and Universal became the first major labels to begin selling MP3s without DRM encryption in 2007. That February, Apple’s Steve Jobs made a plea to all the record companies to abolish DRM completely, saying it was “clearly the best alternative for consumers, and Apple would embrace it in a heartbeat.” John Kennedy, the CEO of the IFPI (the International Federation of the Phonographic Industry), summed up the music industry’s new position on the Pirate’s Dilemma it faced: “At long last the threat has become the opportunity.” The sad truth is that it was an opportunity all along.
The Launch Pad: Inside Y Combinator, Silicon Valley's Most Exclusive School for Startups by Randall Stross
affirmative action, Airbnb, AltaVista, always be closing, Amazon Mechanical Turk, Amazon Web Services, barriers to entry, Ben Horowitz, Burning Man, business cycle, California gold rush, call centre, cloud computing, crowdsourcing, don't be evil, Elon Musk, high net worth, index fund, inventory management, John Markoff, Justin.tv, Lean Startup, Marc Andreessen, Mark Zuckerberg, medical residency, Menlo Park, Minecraft, minimum viable product, Paul Buchheit, Paul Graham, Peter Thiel, QR code, Richard Feynman, Richard Florida, ride hailing / ride sharing, Sam Altman, Sand Hill Road, side project, Silicon Valley, Silicon Valley startup, Skype, social graph, software is eating the world, South of Market, San Francisco, speech recognition, Stanford marshmallow experiment, Startup school, stealth mode startup, Steve Jobs, Steve Wozniak, Steven Levy, TaskRabbit, transaction costs, Y Combinator
“Do you ever, in your work, say, ‘Boy, I wish somebody would just—blank’?” asks Graham. He tries another tack, asking about areas in which they are “domain experts,” the way Google’s founders were experts in search. “We all sort of collected our strengths and things we’re good at,” Shen says. “It’s the most random things.” Graham offers another inspiring tale about a startup that addressed an unmet need: Apple, which was started by Steve Wozniak because he wanted his own computer. “He couldn’t afford the components. So he designed computers on paper. And then DRAMs came along—chips became just cheap enough that he could build a computer.” When Steve Jobs saw what Wozniak had built, he suggested that they sell it to other people. Looking back, the need seems obvious. Graham suggests that they ask themselves: “What will people say in the future was an unmet need today?”
AARRR was an acronym for Acquisition, Activation, Retention, Referral, and Revenue.
CHAPTER 14: RISK
1. “Where Are They Now: Ralston Shepherds Yahoo E-mail from Free to Paid,” MarketWatch, September 20, 2002, www.marketwatch.com/story/the-man-in-charge-of-yahoo-e-mail-shares-his-vision.
2. Apple did not rebrand Lala’s service; it shut the service down the next year. Peter Kafka, “Apple Pulls the Plug on Lala, Replaces It with . . . Nada,” AllThingsD, June 1, 2010, http://allthingsd.com/20100601/apple-pulls-the-plug-on-lala-replaces-it-with-nothing/.
3. PG, “Imagine K12,” YC Web site, March 17, 2011, http://ycombinator.com/imaginek12.html.
4. Ralston was formally appointed a YC partner early the next year. “Welcome Geoff,” YC Posterous, January 27, 2012, http://ycombinator.posterous.com/welcome-geoff.
5.
Artificial Intelligence: A Guide for Thinking Humans by Melanie Mitchell
Ada Lovelace, AI winter, Amazon Mechanical Turk, Apple's 1984 Super Bowl advert, artificial general intelligence, autonomous vehicles, Bernie Sanders, Claude Shannon: information theory, cognitive dissonance, computer age, computer vision, dark matter, Douglas Hofstadter, Elon Musk, en.wikipedia.org, Gödel, Escher, Bach, I think there is a world market for maybe five computers, ImageNet competition, Jaron Lanier, job automation, John Markoff, John von Neumann, Kevin Kelly, Kickstarter, license plate recognition, Mark Zuckerberg, natural language processing, Norbert Wiener, ought to be enough for anybody, pattern recognition, performance metric, RAND corporation, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Rodney Brooks, self-driving car, sentiment analysis, Silicon Valley, Singularitarianism, Skype, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, theory of mind, There's no reason for any individual to have a computer in his home - Ken Olsen, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!
The player’s goal is to maximize the score over the five lives. There’s an interesting side note here. Breakout was the result of Atari’s effort to create a single-player version of its successful game Pong. The design and implementation of Breakout were originally assigned in 1975 to a twenty-year-old employee named Steve Jobs. Yes, that Steve Jobs (later, cofounder of Apple). Jobs lacked sufficient engineering skills to do a good job on Breakout, so he enlisted his friend Steve Wozniak, aged twenty-five (later, the other cofounder of Apple), to help on the project. Wozniak and Jobs completed the hardware design of Breakout in four nights, starting work each night after Wozniak had completed his day job at Hewlett-Packard. Once released, Breakout, like Pong, was hugely popular among gamers. If you’re getting nostalgic but neglected to hang on to your old Atari 2600 game console, you can still find many websites offering Breakout and other games.
Thus the idea for ImageNet was born. Li and her collaborators soon commenced collecting a deluge of images by using WordNet nouns as queries on image search engines such as Flickr and Google image search. However, if you’ve ever used an image search engine, you know that the results of a query are often far from perfect. For example, if you type “macintosh apple” into Google image search, you get photos not only of apples and Mac computers but also of apple-shaped candles, smartphones, bottles of apple wine, and any number of other nonrelevant items. Thus, Li and her colleagues had to have humans figure out which images were not actually illustrations of a given noun and get rid of them. At first, the humans who did this were mainly undergraduates. The work was agonizingly slow and taxing. Li soon figured out that at the rate they were going, it would take ninety years to complete the task.4 Li and her collaborators brainstormed about possible ways to automate this work, but of course the problem of deciding if a photo is an instance of a particular noun is the task of object recognition itself!
In 2012, at the same time that deep learning was revolutionizing computer vision, a landmark paper on speech recognition was published by research groups at the University of Toronto, Microsoft, Google, and IBM.2 These groups had been developing deep neural networks for various aspects of speech recognition: recognizing phonemes from acoustic signals, predicting words from combinations of phonemes, predicting phrases from combinations of words, and so on. According to a Google speech-recognition expert, the use of deep networks resulted in the “biggest single improvement in 20 years of speech research.”3 The same year, a new deep-network speech-recognition system was released to customers on Android phones; two years later it was released on Apple’s iPhone, with one Apple engineer commenting, “This was one of those things where the jump [in performance] was so significant that you do the test again to make sure that somebody didn’t drop a decimal place.”4 If you yourself happened to use any kind of speech-recognition technology both before and after 2012, you will have also noticed a very sharp improvement. Speech recognition, which before 2012 ranged from horribly frustrating to moderately useful, suddenly became very nearly perfect in some circumstances.
Present Shock: When Everything Happens Now by Douglas Rushkoff
algorithmic trading, Andrew Keen, bank run, Benoit Mandelbrot, big-box store, Black Swan, British Empire, Buckminster Fuller, business cycle, cashless society, citizen journalism, clockwork universe, cognitive dissonance, Credit Default Swap, crowdsourcing, Danny Hillis, disintermediation, Donald Trump, double helix, East Village, Elliott wave, European colonialism, Extropian, facts on the ground, Flash crash, game design, global pandemic, global supply chain, global village, Howard Rheingold, hypertext link, Inbox Zero, invention of agriculture, invention of hypertext, invisible hand, iterative process, John Nash: game theory, Kevin Kelly, laissez-faire capitalism, lateral thinking, Law of Accelerating Returns, loss aversion, mandelbrot fractal, Marshall McLuhan, Merlin Mann, Milgram experiment, mutually assured destruction, negative equity, Network effects, New Urbanism, Nicholas Carr, Norbert Wiener, Occupy movement, passive investing, pattern recognition, peak oil, price mechanism, prisoner's dilemma, Ralph Nelson Elliott, RAND corporation, Ray Kurzweil, recommendation engine, selective serotonin reuptake inhibitor (SSRI), Silicon Valley, Skype, social graph, South Sea Bubble, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, supply-chain management, the medium is the message, The Wisdom of Crowds, theory of mind, Turing test, upwardly mobile, Whole Earth Catalog, WikiLeaks, Y2K, zero-sum game
Reinforcing self-similarity on every level, a network of people is better at mapping connections than a lone individual. As author and social critic Steven Johnson would remind us, ideas don’t generally emerge from individuals, but from groups, or what he calls “liquid networks.”1 The coffeehouses of eighteenth-century London spawned the ideas that fueled the Enlightenment, and the Apple computer was less a product of one mind than the collective imagination of the Homebrew Computer Club to which both Steve Jobs and Steve Wozniak belonged. The notion of a lone individual churning out ideas in isolated contemplation like Rodin’s Thinker may not be completely untrue, but it has certainly been deemphasized in today’s culture of networked thinking. As we become increasingly linked to others and dependent on making connections in order to make sense, these new understandings of how ideas emerge are both helpful and reassuring.
The expectations for instant reward and satisfaction have been built up by media for close to a century now. The amount of time between purchase (or even earning) and gratification has shrunk to nothing—so much so that the purchase itself is more rewarding than consuming whatever it is that has been bought. After waiting several days in the street, Apple customers exit the store waving their newly purchased i-gadgets in the air, as if acquisition itself were the reward. The purchase feels embedded with historicity. They were part of a real moment, a specific date. The same way someone may tell us he was at the Beatles’ famous concert at Shea Stadium, the Apple consumer can say he scored the new iPhone on the day it was released. Where “act now” once meant that a particular sales price would soon expire, today it simply means there’s an opportunity to do something at a particular moment. To be a part of something.
Knowledge had handicapped the hedgehogs, while the wide-ranging curiosity of the foxes gave them the edge. Now, on the surface this sounds like the failing of the fractalnoids—those economists who want to equate the properties of plankton with the personalities of Parisians. But it’s fundamentally different in that it’s human beings applying patterns intuitively to different systems, not the frantic confusion of apples and oranges or, more likely, apples with planets. Yes, it is still fraught with peril, but it’s also a rich competency to develop in an era of present shock. For instance, I still don’t know whether to be delighted or horrified by the student who told me he “got the gist” of Hamlet by skimming it in a couple of minutes and reading a paragraph of critique on Wikipedia. The student already saw the world in fractal terms and assumed that being able to fully grasp one moment of Hamlet would mean he had successfully “grokked” the whole.
Utopias: A Brief History From Ancient Writings to Virtual Communities by Howard P. Segal
1960s counterculture, British Empire, Buckminster Fuller, complexity theory, David Brooks, death of newspapers, dematerialisation, deskilling, energy security, European colonialism, Francis Fukuyama: the end of history, full employment, future of journalism, G4S, garden city movement, germ theory of disease, Golden Gate Park, invention of the printing press, Isaac Newton, Jeff Bezos, John Markoff, John von Neumann, knowledge economy, liberation theology, Louis Pasteur, Mark Zuckerberg, mass immigration, means of production, Nelson Mandela, Nicholas Carr, Nikolai Kondratiev, out of africa, Ralph Waldo Emerson, Ray Kurzweil, Ronald Reagan, Silicon Valley, Skype, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Stewart Brand, technoutopianism, Thomas Malthus, Thorstein Veblen, transcontinental railway, traveling salesman, union organizing, urban planning, War on Poverty, Whole Earth Catalog
It would not be much of a challenge to elevate him to overnight celebrity status in the manner, common today, of infinitely less accomplished persons in countless realms. (Robert Noyce also invented the integrated circuit but worked separately from Kilby; however, because he was already dead, he was ineligible for the prize.) There is, however, a bit of historical reverence in Silicon Valley for the three garages in which three major high-tech companies began: Hewlett-Packard (William Hewlett and David Packard, starting in 1938), Apple (Steve Jobs and Steve Wozniak, starting in 1976), and Google (Sergey Brin and Larry Page, starting in 1998). None is open to the public, but all are popular brief stopping places for contemporary geek tours.54 More generally, there is a declining confidence in scientific and technological panaceas—not simply a declining faith in utopia. This was a process that began in Europe during World War I as Victorian confidence and complacency were shattered by the tragic futility of trench warfare and by the use of terrible new weapons of mass destruction such as machine guns, tanks, airplanes, and poison gas.
There have since been improved versions, including a larger-size Kindle that now allows easier access to selected newspapers that were difficult to read in the earlier Kindle. Meanwhile, rivals to the Kindle have inevitably come about, above all the Apple iPad, released in 2010. This tablet computer offers not only books and periodicals but also games, movies, music, and the Internet. Its origins can be traced back to Apple’s first tablet computer, the Newton Message Pad of 1993 (discontinued in 1998), named, of course, after Isaac Newton. Like Apple’s extremely popular iPhone and iPod Touch, the iPad uses a multi-touch finger-sensitive touchscreen—a vast improvement over the pressure-triggered stylus used in previous tablet computers. Studies of human interaction with these devices indicate that touchscreens have become integral components of our daily lives much sooner than other “technological behaviors” because they are “so natural, intimate, and intuitive.”
First came audiobooks, which brought new or renewed attention to thousands of books through enjoyment of them being read by either the authors themselves, professional narrators, or actors. More recently, books and other printed materials have increasingly been revived in the very electronic form that was predicted to be their downfall. Consider podcasts, for example. In 2004 Ben Hammersley coined the term “podcast”—a mixture of “iPod” and “broadcast.” Apple Computer originated the brand of portable media player that first used podcasts, calling it the Apple iPod. A podcast consists of digital media files that are transferred from the Internet to a computer, iPod, smart phone, or other media player. Podcasts of newspaper, magazine, and journal articles have now become routine.68 There is a growing audience of those willing and often eager to listen to podcasts and to read online versions of books, newspapers, magazines, and journals.
The Seventh Sense: Power, Fortune, and Survival in the Age of Networks by Joshua Cooper Ramo
Airbnb, Albert Einstein, algorithmic trading, barriers to entry, Berlin Wall, bitcoin, British Empire, cloud computing, crowdsourcing, Danny Hillis, defense in depth, Deng Xiaoping, drone strike, Edward Snowden, Fall of the Berlin Wall, Firefox, Google Chrome, income inequality, Isaac Newton, Jeff Bezos, job automation, Joi Ito, market bubble, Menlo Park, Metcalfe’s law, Mitch Kapor, natural language processing, Network effects, Norbert Wiener, Oculus Rift, packet switching, Paul Graham, price stability, quantitative easing, RAND corporation, recommendation engine, Republic of Letters, Richard Feynman, road to serfdom, Robert Metcalfe, Sand Hill Road, secular stagnation, self-driving car, Silicon Valley, Skype, Snapchat, social web, sovereign wealth fund, Steve Jobs, Steve Wozniak, Stewart Brand, Stuxnet, superintelligent machines, technological singularity, The Coming Technological Singularity, The Wealth of Nations by Adam Smith, too big to fail, Vernor Vinge, zero day
Tricking the phone and hearing the system’s clicks gave a sense of secret access, a feeling of control in the largest network on earth. At one point, a phreaker named John Draper figured out that the little plastic whistles stuffed as children’s toys inside boxes of sugary Cap’n Crunch cereal produced the 2,600-hertz tone nearly perfectly. The hack made him a legend, and he became known, inevitably, as Cap’n Crunch. An article about Draper in Esquire in 1971 had inspired two teenagers named Steve Jobs and Steve Wozniak to start their first company in order to build and sell little blue phreaking boxes. Woz later recalled nervously meeting the Cap’n one day in California. He was a strange, slightly smelly, and extremely intense nomadic engineer. “I do it for one reason and one reason only,” the Cap’n huffed to the writer of that Esquire article, who was a bit baffled as to why a grown man would find whistling into phones so appealing.
You might ask: What drew tens of millions of people to watch as Steve Jobs, live, unveiled some new Apple device? Of course, partly it was the cool technology, the warm charisma of the man. But something else was at work, I think. What Jobs was unveiling atop those black stages over the years as we waited for him was nothing less than whole new worlds, connected landscapes that emerged entirely from ideas Apple was secretly developing. He wasn’t merely introducing a phone; he was changing how we were going to experience life. “Every once in a while, a revolutionary product comes along that changes everything,” Jobs began in his famous speech introducing the first iPhone, in 2007. “In 1984 we introduced the Macintosh. It didn’t just change Apple. It changed the whole computer industry. In 2001 we introduced the first iPod.
In 2001 we introduced the first iPod. It didn’t just change the way we all listen to music. It changed the entire music industry.” Apple devices were cracking open paths to whole new worlds in this sense. The company develops an app for podcasts; a new media form is born. It builds an architecture for video calling; our relations to each other deepen a bit. What Jobs was presenting were new and—until that very instant—unimagined universes of possibility that we would all explore. No wonder the world tuned in. Power pulses through structure as molten metal might pass hot into a mold, leaving behind something solid and hard to snap—forms for politics, wealth, and influence. The Orientalist scholar Karl Wittfogel followed this link between form and power as he developed his famous “hydraulic hypothesis” in the 1930s. Ancient agrarian societies such as Egypt and China were formed not least by the need for large-scale irrigation.
Creative Intelligence: Harnessing the Power to Create, Connect, and Inspire by Bruce Nussbaum
3D printing, Airbnb, Albert Einstein, Berlin Wall, Black Swan, Chuck Templeton: OpenTable:, clean water, collapse of Lehman Brothers, creative destruction, Credit Default Swap, crony capitalism, crowdsourcing, Danny Hillis, declining real wages, demographic dividend, disruptive innovation, Elon Musk, en.wikipedia.org, Eugene Fama: efficient market hypothesis, Fall of the Berlin Wall, follow your passion, game design, housing crisis, Hyman Minsky, industrial robot, invisible hand, James Dyson, Jane Jacobs, Jeff Bezos, jimmy wales, John Gruber, John Markoff, Joseph Schumpeter, Kickstarter, lone genius, longitudinal study, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Martin Wolf, new economy, Paul Graham, Peter Thiel, QR code, race to the bottom, reshoring, Richard Florida, Ronald Reagan, shareholder value, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, six sigma, Skype, Steve Ballmer, Steve Jobs, Steve Wozniak, supply-chain management, Tesla Model S, The Chicago School, The Design of Experiments, the High Line, The Myth of the Rational Market, thinkpad, Tim Cook: Apple, too big to fail, tulip mania, We are the 99%, Y Combinator, young professional, Zipcar
As a teenager, he hung out in the garage of his neighbor Larry Lang, an engineer who got Jobs into the Hewlett-Packard Explorers Club. When he needed parts for a frequency counter he was building for the club, he called HP’s CEO Bill Hewlett directly and spoke to him for twenty minutes, according to Walter Isaacson in his biography of Jobs. One could argue that the Apple cofounder’s early introduction to electronics, his friendships with other techies like Steve Wozniak, and his growing up in a hot high-tech culture were instrumental in his development as a tinkerer and designer of computers. Embodiment begins with knowing yourself—who you are, what cultures you belong to, and what you want to create in the world. From the Park Slope mom who builds a babysitting sharing site for her community to the young doctors who embrace social networking in their practices, we all have had experiences that, if mined for their true value, can help us customize our careers and lives.
But the technology of our time—their improved features and lowered costs, their ability to make us all creators and not just passive users—can, in fact, connect people in ways that the films or photographs of seven decades ago could not. As with many of the Creative Intelligence competencies, the road leads back to Apple. Consumers call Apple products “cool” and “easy to use,” and more sophisticated business analysts applaud Apple’s “ecosystem” of integrated software and hardware. But none of those qualities alone explains why we feel the way we do about Apple products; it’s impossible to discuss Apple products without mentioning how they feel in the hand, look to the eye, and connect to our deep emotions. The story of how Apple began creating beautiful, easy-to-use products should be required reading for anyone interested in creating something that’s not just useful but meaningful. WHEN STEVE JOBS RETURNED TO Apple in 1997, after twelve years in exile, he bet the company and his future on a radical new idea: an easy-to-use, stand-alone PC that looked unlike any other computer before it—translucent, colorful, fun.
; http://designmuseum.org/exhibitions/online/jonathan-ive-on-apple/imac-1998, accessed September 5, 2012; Janet Abrams, “Radical Craft/The Second Art Center Design Conference,” http://www.core77.com/reactor/04.06_artcenter.asp, accessed September 5, 2012. 187 Ive then spent yet more: Burrows, “Who Is Jonathan Ive?” 188 They also designed a beautiful: Neil Hughes, “Book Details Apple’s ‘Packaging Room,’ Steve Jobs’ Interest in Advanced Cameras,” Apple Insider, January 24, 2012, accessed September 5, 2012, http://www.appleinsider.com/articles/12/01/24/book_details_apples_packaging_room_interests_in_advanced_cameras_.html; Yonu Heisler, “Inside Apple’s Secret Packaging Room,” Network World, January 24, 2012, accessed September 5, 2012, http://www.networkworld.com/community/blog/inside-apples-secret-packaging-room. 188 The iMac’s launch in 1998: http://www.youtube.com/watch?v=0BHPtoTctDY, accessed September 5, 2012; http://designmuseum.org/design/jonathan-ive, accessed September 5, 2012; John Webb, “10 Success Principles of Apple’s Innovation Master Jonathan Ive,” Innovation Excellence, April 30, 2012, accessed September 5, 2012, http://www.innovationexcellence.com/blog/2012/04/30/10-success-principles-of-apples-innovation-master-jonathan-ive/. 188 In a 2006 interview with Peter Burrows: Burrows, “Who Is Jonathan Ive?”
Augmented: Life in the Smart Lane by Brett King
23andMe, 3D printing, additive manufacturing, Affordable Care Act / Obamacare, agricultural Revolution, Airbnb, Albert Einstein, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, Apple II, artificial general intelligence, asset allocation, augmented reality, autonomous vehicles, barriers to entry, bitcoin, blockchain, business intelligence, business process, call centre, chief data officer, Chris Urmson, Clayton Christensen, clean water, congestion charging, crowdsourcing, cryptocurrency, deskilling, different worldview, disruptive innovation, distributed generation, distributed ledger, double helix, drone strike, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, fiat currency, financial exclusion, Flash crash, Flynn Effect, future of work, gig economy, Google Glasses, Google X / Alphabet X, Hans Lippershey, Hyperloop, income inequality, industrial robot, information asymmetry, Internet of things, invention of movable type, invention of the printing press, invention of the telephone, invention of the wheel, James Dyson, Jeff Bezos, job automation, job-hopping, John Markoff, John von Neumann, Kevin Kelly, Kickstarter, Kodak vs Instagram, Leonard Kleinrock, lifelogging, low earth orbit, low skilled workers, Lyft, M-Pesa, Mark Zuckerberg, Marshall McLuhan, megacity, Metcalfe’s law, Minecraft, mobile money, money market fund, more computing power than Apollo, Network effects, new economy, obamacare, Occupy movement, Oculus Rift, off grid, packet switching, pattern recognition, peer-to-peer, Ray Kurzweil, RFID, ride hailing / ride sharing, Robert Metcalfe, Satoshi Nakamoto, Second Machine Age, selective serotonin reuptake inhibitor (SSRI), self-driving car, sharing economy, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart grid, smart transportation, Snapchat, social graph, software as a service, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, TaskRabbit, technological singularity, telemarketer, telepresence, telepresence robot, Tesla Model S, The Future of Employment, Tim Cook: Apple, trade route, Travis Kalanick, Turing complete, Turing test, uber lyft, undersea cable, urban sprawl, V2 rocket, Watson beat the top human players on Jeopardy!, white picket fence, WikiLeaks
Leonard Kleinrock, UCLA, from an interview on the first ARPANET packet-switching test in 1969 In parallel to the development of early computer networks, various computer manufacturers set about shrinking and personalising computer technology so that it could be used at home or in the office. Contrary to popular belief, IBM wasn’t the first company to create a personal computer (PC). In the early 1970s, Steve Jobs and Steve Wozniak had been busy working on their own version of the personal computer. The result—the first Apple computer (retrospectively known as the Apple I)—actually preceded the IBM model by almost five years, and used a very different engineering approach. However, it wasn’t until Apple launched the Apple II that personal computing really became a “thing”. Figure 3.2: An original Apple I computer designed by Jobs and Wozniak and released in 1976 (Credit: Bonhams New York) Around the same time as Jobs and Wozniak’s development of the earliest form of PC, there was also a rapid downsizing of computers in the workplace.
Let’s dive deeper into this concept of adding years to our lives and life to our years, and building our brains while we are at it, before looking at how to activate the protection against linear decay, and upgrading ourselves. Figure 5.5: Apple’s HealthKit measures 67 different categories. (Credit: Apple) QS products started off as separate products and, by 2013, had reached over US$200 million a year in sales, primarily for devices that counted steps and calculated calories based on height, age and weight input by the user. Several apps have duplicated or emulated this functionality on iOS and Android. Such functionality became part of Apple’s iOS in 2015, in large part to increase the functionality and usefulness of the Apple Watch. Apple calls this particular app HealthKit. As you can see from the screen shot of Apple’s HealthKit app in figure 5.5, there are seven major categories (body measurements, fitness, me, nutrition, results, sleep and vitals) and 67 separate categories under All, ranging from active calories through to zinc levels.
To illustrate, Kodak at its peak employed 140,000 people, whereas Instagram, arguably the Millennials’ version of Kodak (acquired by Facebook for approximately US$715 million in stock in 2012), only had 13 employees at the time of the acquisition. So it could be postulated that technology is, on a net basis, bad for society when it comes to employment. Apple creates many jobs in its stores, and Foxconn factories reportedly employ 1.23 million people, the majority of whom are dedicated to Apple product manufacturing and assembly, but in terms of the sheer size of the Apple Inc. economy, the company employs relatively few people. Given the efficiency gains that technology brings, does that mean that as technology displaces historical businesses it inevitably destroys jobs? Actually no, that’s not what the research shows at all. “The Internet’s impact on global growth is rising rapidly.
B Is for Bauhaus, Y Is for YouTube: Designing the Modern World From a to Z by Deyan Sudjic
3D printing, additive manufacturing, Albert Einstein, Berlin Wall, Boris Johnson, Buckminster Fuller, call centre, carbon footprint, clean water, dematerialisation, deskilling, edge city, Elon Musk, Frank Gehry, Guggenheim Bilbao, illegal immigration, James Dyson, Jane Jacobs, Kitchen Debate, light touch regulation, market design, megastructure, moral panic, New Urbanism, place-making, QWERTY keyboard, Silicon Valley, Steve Jobs, Steve Wozniak, the scientific method, University of East Anglia, urban renewal, urban sprawl, young professional
With mass production, the point is usually not one of originality, but of what is fake and what is not. There are plenty of stalls in Shanghai or Shenzhen where you can find alleged Rolex wristwatches complete with impressive-looking authenticating holograms that crumble the first time you try the winding mechanism. There are devices that look something like Apple iPods but are not produced in the Shenzhen factory that assembles those sold by the company Steve Jobs and Steve Wozniak started in 1976. More worrying for Apple was the way in which Samsung was able to replicate not just the iPhone but also the iPad. Apple claimed that these were copyright infringements, rather than fakes. For a designer, authenticity has taken on a paradoxical aspect. An authentic design might be understood as a design which is more than merely not a fake. It is also an object which is unselfconscious, one which is not shaped by a desire to please or to seduce.
At its peak, in 1997, just before sales of film fell off a cliff, the company was valued at $30 billion. Fifteen years later, Kodak was bankrupt. The company had seen digital photography coming. It built Apple’s QuickTake, launched in 1994, which was one of the first digital cameras aimed at a general audience, even if the price when it first went on sale was $750. But when images are stored as random collections of pixels, rather than on silver and paper, the technical expertise in chemistry and the distribution system it had built up over decades were no longer relevant. There was very little that Kodak could offer Apple. Once digital photography took firm hold, Kodak’s income collapsed, threatening the company’s survival. The end was shockingly rapid. In 2000, the US bought 950 million rolls of film, most of it from Kodak.
The ubiquity of the car has tended to inoculate us to its significance, but we do not have to consider a car as a piece of art to understand the huge impact it has had on the world. It is, for better or worse, the peak of the industrial culture that gave birth to the practice of modern design. Yet its power is waning. The Ford Motor Company, founded by Henry Ford, who was no more comfortable a personality than Steve Jobs, used to be the model of the modern corporation, with its company towns, its own orchestra, its own company uniform. Apple and Google have supplanted Ford and IBM as the model corporations that others seek to emulate. And while there are now companies around the globe that have managed to make cars more profitably, and more effectively, than Ford, they are essentially in the business of refining a mature product that may have a limited future. Bayley’s proposition would have been more convincing but for one of the more unexpected developments of the post-millennial museum-building explosion: the creation of a mass public for contemporary art.
The greatest trade ever: the behind-the-scenes story of how John Paulson defied Wall Street and made financial history by Gregory Zuckerman
1960s counterculture, banking crisis, collapse of Lehman Brothers, collateralized debt obligation, Credit Default Swap, credit default swaps / collateralized debt obligations, financial innovation, fixed income, index fund, Isaac Newton, Long Term Capital Management, margin call, Mark Zuckerberg, Menlo Park, merger arbitrage, mortgage debt, mortgage tax deduction, Ponzi scheme, Renaissance Technologies, rent control, Robert Shiller, rolodex, short selling, Silicon Valley, statistical arbitrage, Steve Ballmer, Steve Wozniak, technology bubble, zero-sum game
Burry sold Greenblatt a 22.5 percent piece of the business, using the proceeds to pay off his school loans. He named his firm Scion Capital, inspired by the book The Scions of Shannara, a Terry Brooks fantasy novel. Burry would be a scion of investing greats such as Buffett and Benjamin Graham, although he would chart his own path. Back in California, he rented a small office in a suburban office park, blocks from the headquarters of Apple Computer. The office had been Apple cofounder Steve Wozniak’s, which Burry took as an auspicious sign. Burry wasn’t very good at courting clients, but he figured if his results were strong enough, investors would line up. Early on in his fund, after top executives of Avanti Software were charged with stealing secrets from a rival and the stock plunged to $2 per share, Burry determined that customers still were relying on Avanti’s products.
The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop
Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, Donald Davies, double helix, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Leonard Kleinrock, Marc Andreessen, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Robert Metcalfe, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process, zero-sum game
(The scruffily bearded Jobs wore his T-shirts, jeans, and sandals like a badge of honor.) And partly it was attributable to his memory of Steve Wozniak's former employer, Hewlett-Packard, which had once rebuffed Woz's proposal for a microcomputer. But in any case, Jobs changed his mind after repeated urging by Apple engineer Jef Raskin, who had joined the company to help design the Apple II. Raskin had visited PARC, as it happened, and his friends there had shown him its wonders. So on April 2, 1979, Jobs and his team met with the XDC people and struck a deal that could make sense only in the go-go world of Silicon Valley: Xerox would be allowed to invest $1.05 million in Apple's private stock sale, and in return it would allow Apple full access to PARC's technology. Jobs had only the vaguest idea of what that involved, evidently.
On the hardware side, this challenge was taken up most famously by the Apple Computer Company, founded in 1976 by Homebrew Computer Club members Steve Wozniak and Steve Jobs, longtime buddies from the Silicon Valley town of Cupertino. After some encouraging success with their first computer, which they marketed through local hobby shops (it was actually just a single circuit board using the new, 8-bit 6502 microprocessor from MOS Technology, plus 4 kilobytes of RAM), Jobs and Wozniak were joined by the thirty-four-year-old A. C. Markkula, formerly the marketing manager for Intel. Markkula, who had retired from that company two years earlier after earning more than a million dollars in stock options, bought a one-third partnership in Apple for $91,000 and began working his contacts to bring in venture capital and management expertise.
And perhaps most important of all, it was great for playing video games; Wozniak, the technical wizard of the team and a video-game addict himself, had designed it with precisely that use in mind. Of course, the Apple II had plenty of competition in the consumer market, notably from the Commodore PET, which debuted at the same West Coast Computer Faire, and from the Tandy-Radio Shack TRS-80, which was introduced the following August. In the beginning, moreover, the promises made in the Regis McKenna ad copy ("You'll be able to organize, index and store data on household finances, income taxes, recipes, your biorhythms, balance your checking account, even control your home environment") were little more than fantasy; the applications software that would work such magic didn't exist yet. Nonetheless, the Apple II was an instant hit. By decade's end Apple itself had become one of the fastest-growing companies in American history.
Coastal California Travel Guide by Lonely Planet
1960s counterculture, Airbnb, airport security, Albert Einstein, anti-communist, Apple II, Asilomar, back-to-the-land, Bay Area Rapid Transit, Burning Man, buy and hold, California gold rush, call centre, car-free, carbon footprint, Donner party, East Village, El Camino Real, Electric Kool-Aid Acid Test, flex fuel, Frank Gehry, glass ceiling, Golden Gate Park, Haight Ashbury, haute couture, haute cuisine, income inequality, intermodal, Joan Didion, Kickstarter, Loma Prieta earthquake, low cost airline, Lyft, Mason jar, New Journalism, ride hailing / ride sharing, Ronald Reagan, Rosa Parks, Saturday Night Live, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, starchitect, stealth mode startup, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, trade route, transcontinental railway, uber lyft, Upton Sinclair, upwardly mobile, urban sprawl, Wall-E, white picket fence, Whole Earth Catalog, women in the workforce, working poor, Works Progress Administration, young professional, Zipcar
SAN JOSE FOR CHILDREN Children’s Discovery Museum (museum; 408-298-5437; www.cdm.org; 180 Woz Way; $13, child under 1yr free; 10am-5pm Tue-Sat, noon-5pm Sun) Downtown, this science and creativity museum has hands-on displays incorporating art, technology and the environment, with plenty of toys and cool play-and-learn areas for tots to school-aged children. The museum is on Woz Way, named after Steve Wozniak, cofounder of Apple. California's Great America (amusement park; 408-988-1776; www.cagreatamerica.com; 4701 Great America Pkwy, Santa Clara; adult/child under 4ft $69/48; Apr-Oct, hours vary) If you can handle the shameless product placements, kids love the roller coasters and other thrill rides. Online tickets cost much less than walk-up prices. Parking is $15 to $25, but the park is also accessible by public transportation.
When Hewlett-Packard introduced the first personal computer in 1968, advertisements breathlessly gushed that the ‘light’ (40lb) machine could ‘take on roots of a fifth-degree polynomial, Bessel functions, elliptic integrals and regression analysis’ – all for just $4900 (almost $35,000 today). Consumers didn’t know quite what to do with computers, but in his 1969 Whole Earth Catalog, author (and former LSD tester for the CIA) Stewart Brand explained that the technology governments used to run countries could empower ordinary people. Hoping to bring computer power to the people, 21-year-old Steve Jobs and Steve Wozniak introduced the Apple II at the 1977 West Coast Computer Faire. Still, the question remained: what would ordinary people do with all that computing power? By the mid-1990s an entire dot-com industry boomed in Silicon Valley with online start-ups, and suddenly people were getting everything – their mail, news, politics, pet food and, yes, sex – online. But when dot-com profits weren’t forthcoming, venture-capital funding dried up and fortunes in stock options disappeared when the dot-com bubble burst and the Nasdaq Stock Market plummeted on March 10, 2000.
Foley Sonoma (winery; 707-433-1944; www.foleysonoma.com; 5110 Hwy 128; tasting $20, incl tour $40; 10am-5pm) Wow, what a view from the hilltop concrete-and-glass tasting room at Foley Sonoma. Winemaker Courtney Foley specializes in Bordeaux varietals and blends, along with Zinfandels and Pinots. An hour-long tour takes guests through the crush pad, barrel room and vineyard and finishes with a tasting. Bottles are $30 to $80. Sebastopol: Grapes have replaced apples as the new cash crop, but Sebastopol’s farm-town identity remains rooted in the apple – evidenced by the much-heralded summertime Gravenstein Apple Fair. The town center feels suburban because of traffic, but a hippie tinge gives it color. This is the refreshingly laid-back side of Wine Country and makes a good-value base for exploring the area. Sights: Around Sebastopol, look for family-friendly farms, gardens, animal sanctuaries and pick-your-own orchards. For a countywide list, check out the Sonoma County Farm Trails Guide (www.farmtrails.org).
Is the Internet Changing the Way You Think?: The Net's Impact on Our Minds and Future by John Brockman
A Declaration of the Independence of Cyberspace, Albert Einstein, AltaVista, Amazon Mechanical Turk, Asperger Syndrome, availability heuristic, Benoit Mandelbrot, biofilm, Black Swan, British Empire, conceptual framework, corporate governance, Danny Hillis, Douglas Engelbart, Emanuel Derman, epigenetics, Flynn Effect, Frank Gehry, Google Earth, hive mind, Howard Rheingold, index card,