37 results
air freight, Apple II, Bill Gates: Altair 8800, Buckminster Fuller, Byte Shop, computer age, computer vision, corporate governance, El Camino Real, game design, Hacker Ethic, hacker house, Haight Ashbury, John Conway, Mark Zuckerberg, Menlo Park, non-fiction novel, Paul Graham, popular electronics, RAND corporation, reversible computing, Richard Stallman, Silicon Valley, software patent, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Whole Earth Catalog, Y Combinator
The Apple ad even said, “our philosophy is to provide software for our machines free or at minimal cost.” While the selling was going on, Steve Wozniak began working on an expanded design of the board, something that would impress his Homebrew peers even more. Steve Jobs had plans to sell many computers based on this new design, and he started getting financing, support, and professional help for the day the product would be ready. The new version of Steve Wozniak’s computer would be called the Apple II, and at the time no one suspected that it would become the most important computer in history.

• • •

It was the fertile atmosphere of Homebrew that guided Steve Wozniak through the incubation of the Apple II. The exchange of information, the access to esoteric technical hints, the swirling creative energy, and the chance to blow everybody’s mind with a well-hacked design or program . . . these were the incentives which only increased the intense desire Steve Wozniak already had: to build the kind of computer he wanted to play with.
I believe their story—their vision, their intimacy with the machine itself, their experiences inside their peculiar world, and their sometimes dramatic, sometimes absurd “interfaces” with the outside world—is the real story of the computer revolution.

Who’s Who: The Wizards and Their Machines

Bob Albrecht. Founder of People’s Computer Company who took visceral pleasure in exposing youngsters to computers.
Altair 8800. The pioneering microcomputer that galvanized hardware hackers. Building this kit made you learn hacking. Then you tried to figure out what to do with it.
Apple II. Steve Wozniak’s friendly, flaky, good-looking computer, wildly successful and the spark and soul of a thriving industry.
Atari 800. This home computer gave great graphics to game hackers like John Harris, though the company that made it was loath to tell you how it worked.
Bob and Carolyn Box. World-record-holding gold prospectors turned software stars, working for Sierra On-Line.
Doug Carlston. Corporate lawyer who chucked it all to form the Brøderbund software company.
TX-0. Filled a small room, but in the late fifties, this $3 million machine was the world’s first personal computer—for the community of MIT hackers that formed around it.
Jim Warren. Portly purveyor of “techno-gossip” at Homebrew, he was the first editor of the hippie-styled Dr. Dobb’s Journal and later started the lucrative Computer Faire.
Randy Wigginton. Fifteen-year-old member of Steve Wozniak’s kiddie corps, he helped Woz trundle the Apple II to Homebrew. Still in high school when he became Apple’s first software employee.
Ken Williams. Arrogant and brilliant young programmer who saw the writing on the CRT and started Sierra On-Line to make a killing and improve society by selling games for the Apple computer.
Roberta Williams. Ken Williams’ timid wife who rediscovered her own creativity by writing Mystery House, the first of her many bestselling computer games.
air freight, Apple II, Bill Gates: Altair 8800, card file, Chance favours the prepared mind, cuban missile crisis, dumpster diving, Hush-A-Phone, index card, Jason Scott: textfiles.com, Menlo Park, popular electronics, Richard Feynman, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, the scientific method, urban renewal, wikimedia commons
While there, Draper claims, he taught the art of phone phreaking to dozens of other inmates. Draper soon went to work for his friend Steve Wozniak at Apple Computer, designing an innovative product called the Charley Board. Charley was an add-in circuit board for the Apple II that connected the computer to the telephone line. With Charley and a few simple programs you could make your Apple II do all sorts of telephonic tricks. Not only could it dial telephone numbers and send touch tones down the line, it could even listen to the calls it placed and recognize basic telephone signals as the call progressed, signals such as a dial tone or busy signal or a ringing signal. With the right programming it could be used as a modem. An Apple II with a Charley Board, in fact, became the ultimate phone phreaking tool. Just as the phone company thought it was natural to mix computers and phone switches, John Draper thought it was natural to mix computers and phone phreaking.
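The “touch tones” Charley could send down the line are ordinary DTMF signaling: each key is the sum of one low-frequency and one high-frequency sine wave, which is why a computer with even a crude audio output could dial. A minimal Python sketch of that encoding (the frequency grid is the standard DTMF table; the sample rate and tone duration are arbitrary choices for illustration, not details from the text):

```python
import math

# Standard DTMF grid: each key mixes one row (low) tone and one column (high) tone, in Hz.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_samples(key, duration=0.1, rate=8000):
    """Return audio samples (floats in [-1, 1]) for one DTMF key press."""
    low, high = DTMF[key]
    n = int(duration * rate)
    return [0.5 * math.sin(2 * math.pi * low * t / rate)
            + 0.5 * math.sin(2 * math.pi * high * t / rate)
            for t in range(n)]

samples = dtmf_samples("5")
print(len(samples))  # 0.1 s at 8000 samples/s -> 800 samples
```

Playing those samples through a speaker held to a handset is all “dialing” amounted to on an in-band signaling network.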
Everyone needed hardware and software hackers to help them. Riches, or promises of riches, or maybe just a fun job that might pay the bills beckoned. In 1976 former phone phreaks Steve Jobs and Steve Wozniak were selling Apple I computers to their fellow hobbyists. “Jobs placed ads in hobbyist publications and they began selling Apples for the price of $666.66,” journalist Steven Levy wrote. “Anyone in Homebrew could take a look at the schematics for the design, Woz’s BASIC was given away free with the purchase of a piece of equipment that connected the computer to a cassette recorder.” The fully assembled and tested Apple II followed later that year. By 1977 microcomputers had begun to enter the mainstream. You could stroll down to your local Radio Shack and buy a TRS-80 microcomputer off the shelf, something absolutely unheard of just a year earlier.
Excerpt from IWOZ: COMPUTER GEEK TO CULT ICON: HOW I INVENTED THE PERSONAL COMPUTER, COFOUNDED APPLE, AND HAD FUN DOING IT by Steve Wozniak and Gina Smith. Copyright © 2006 by Steve Wozniak and Gina Smith. Used by permission of W. W. Norton & Company, Inc.

ISBN-13: 978-0-8021-9375-9
Grove Press, an imprint of Grove/Atlantic, Inc.
841 Broadway, New York, NY 10003
Distributed by Publishers Group West
www.groveatlantic.com

To the men and women of the Bell System, and especially to the members of the technical staff of Bell Laboratories, without whom none of this would have been possible

CONTENTS
FOREWORD BY STEVE WOZNIAK
A NOTE ON NAMES AND TENSES
CHAPTER 1 FINE ARTS
CHAPTER 2 BIRTH OF A PLAYGROUND
CHAPTER 3 CAT AND CANARY
CHAPTER 4 THE LARGEST MACHINE IN THE WORLD
CHAPTER 5 BLUE BOX
CHAPTER 6 “SOME PEOPLE COLLECT STAMPS”
CHAPTER 7 HEADACHE
CHAPTER 8 BLUE BOX BOOKIES
CHAPTER 9 LITTLE JOJO LEARNS TO WHISTLE
CHAPTER 10 BILL ACKER LEARNS TO PLAY THE FLUTE
CHAPTER 11 THE PHONE FREAKS OF AMERICA
PHOTO INSERT
CHAPTER 12 THE LAW OF UNINTENDED CONSEQUENCES
CHAPTER 13 COUNTERCULTURE
CHAPTER 14 BUSTED
CHAPTER 15 PRANKS
CHAPTER 16 THE STORY OF A WAR
CHAPTER 17 A LITTLE BIT STUPID
CHAPTER 18 SNITCH
CHAPTER 19 CRUNCHED
CHAPTER 20 TWILIGHT
CHAPTER 21 NIGHTFALL
EPILOGUE
SOURCES AND NOTES
ACKNOWLEDGMENTS
INDEX

THE PLAYGROUND

Phone phreak (n.) 1.
Founders at Work: Stories of Startups' Early Days by Jessica Livingston
8-hour work day, affirmative action, AltaVista, Apple II, Brewster Kahle, business process, Byte Shop, Danny Hillis, don't be evil, fear of failure, financial independence, Firefox, full text search, game design, Googley, HyperCard, illegal immigration, Internet Archive, Jeff Bezos, Maui Hawaii, Menlo Park, nuclear winter, Paul Buchheit, Paul Graham, Peter Thiel, Richard Feynman, Sand Hill Road, side project, Silicon Valley, slashdot, social software, software patent, South of Market, San Francisco, Startup school, stealth mode startup, Steve Ballmer, Steve Jobs, Steve Wozniak, web application, Y Combinator
Little did he know that I was actually up all night writing a business plan, not partying.

CHAPTER 3
Steve Wozniak
Cofounder, Apple Computer

If any one person can be said to have set off the personal computer revolution, it might be Steve Wozniak. He designed the machine that crystallized what a desktop computer was: the Apple II. Wozniak and Steve Jobs founded Apple Computer in 1976. Between Wozniak’s technical ability and Jobs’s mesmerizing energy, they were a powerful team. Woz first showed off his home-built computer, the Apple I, at Silicon Valley’s Homebrew Computer Club in 1976. After Jobs landed a contract with the Byte Shop, a local computer store, for 100 preassembled machines, Apple was launched on a rapid ascent. Woz soon followed with the machine that made the company: the Apple II. He single-handedly designed all its hardware and software—an extraordinary feat even for the time.
So I said to him, “Now that games are software, it’s going to be a different world for games.” And the Apple II, so many people just started trying to figure out how can you get rocket ships to launch, how can you get things that sound like sound when you have a real cruddy voltage to a speaker. How do you listen to somebody talk and figure out what they said? They started using the Apple II. It was just open to all these things. We made it easy for anyone to do what they wanted to do. And I think that was one of the biggest keys to its success. We didn’t make it a hidden machine that we own—we sell it, it does this, you got it—like Commodore and RadioShack did. We put out manuals that had just hundreds of pages of listings of code, descriptions of circuits, examples of boards that you would plug in—so that anyone could look at this and say, “Now I know how I would do my own.”
I can’t remember the name of the company out of the East, but a venture group. They came in and met us all early on, and they did put in . . . Mike figured out that we were going to need some cash, we were going to be so fast growing. And when you are fast growing, you need more cash right away. So we did have a venture deal in place from well before we shipped an Apple II. And sometime after we were shipping the Apple IIs, we got, I think, $800,000 or $300,000—some large amount—from one venture capital place.

Livingston: On the East Coast?

Wozniak: I believe that’s where we arranged it. Mike Markkula had worked with this guy Hank Smith at Intel, so that’s how they knew each other. And I think Don Valentine actually put some money in, but then it came to a point where he wanted to make some good money and buy some stock off Steve Jobs for like $5.50 before we went public. $5.50 a share, and Steve thought it was too low.
Commodore: A Company on the Edge by Brian Bagnall
Apple II, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, Douglas Engelbart, Firefox, game design, index card, inventory management, Isaac Newton, low skilled workers, Menlo Park, packet switching, pink-collar, popular electronics, prediction markets, pre–internet, QWERTY keyboard, Robert X Cringely, Silicon Valley, special economic zone, Steve Jobs, Steve Wozniak, Ted Nelson
Unfortunately, the Apple II emitted strong radio interference signals, which the FCC does not allow in consumer devices. To get around the rules, Apple pretended it did not intend the Apple II for the home market. According to Yannes, “They basically said, ‘Okay, this is a data processing device and therefore the Class A FCC rules apply to it (the more relaxed rules) because this is going to be used in an industrial environment and not in a home.’ They couldn’t put an R/F modulator in it to hook to your TV set because obviously that was something for the home.” Without an R/F modulator, the Apple II was too complicated for inexperienced users. “The PET and the TRS-80 both came with their own monitors, so they were a more appropriate solution for most people than the Apple II was,” says Yannes. The original design by Steve Wozniak also had several flaws. “Right after the Apple II came out, Electronic Engineering Times wrote a story about the three major design flaws that Woz made on the Apple II,” says Peddle.
Steve Jobs was trying to talk Commodore into buying the Apple II for a large amount, like hundreds of thousands of dollars. … Steve Jobs also wanted Commodore to hire us along with the proposed deal. The deal was never on paper and never concrete, as to how much.” “The discussions never got beyond a meeting with Jack and Steve,” says Peddle. “Jack’s view was that Steve wanted much too much for the company and in the fall of 1976 he was right. I remember him laughing about Steve Jobs and his view of his company’s position.” Tramiel was willing to purchase Apple, but he wanted the lowest price possible. To do that, he put pressure on Jobs by refusing his initial offer. “Basically Jack decided to go along with it, but he tried to squeeze Steve,” explains Peddle. Steve Wozniak tells a different story. “We were told that Chuck wanted to do his own thing and that he could do better than us at reaching the cheap needs of customers.
According to Herd, “Russell was later rewarded with a $20,000 bonus for coming up with this solution.”
Compute! Magazine, October 1982.
One of these users was Linus Torvalds. In 1981, the nine-year-old received a VIC-20 from his grandfather and used it to learn BASIC programming. He was taking the first steps which eventually led him to create a revolution with Linux.
Time magazine, “The Hottest-Selling Hardware” (January 3, 1983), p. 37.
Steve Wozniak has attempted to claim the Apple II was the first to a million. On BBC World’s Most Powerful, aired December 2003, Wozniak claimed, “Sales shot sky high. Apple was the first company to sell a hundred thousand computers—a million computers.”

CHAPTER 29
Selling the Revolution
1982

With the engineering job complete (or at least good enough), Charlie Winterble’s team reluctantly stepped away from the Commodore 64.
accounting loophole / creative accounting, Alfred Russel Wallace, Apple II, barriers to entry, British Empire, Burning Man, Cass Sunstein, Clayton Christensen, don't be evil, Douglas Engelbart, Howard Rheingold, Hush-A-Phone, informal economy, intermodal, Internet Archive, invention of movable type, invention of the telephone, invisible hand, Jane Jacobs, Joseph Schumpeter, Menlo Park, open economy, packet switching, PageRank, profit motive, road to serfdom, Ronald Coase, shareholder value, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, Telecommunications Act of 1996, The Chicago School, The Death and Life of Great American Cities, the market place, The Wisdom of Crowds, too big to fail, Upton Sinclair, urban planning
The Apple’s operating system, using a form of BASIC as its programming language and operating environment, was, moreover, one that anyone could program. It made it possible to write and sell one’s programs directly, creating what we now call the “software” industry. In 2006, I briefly met with Steve Wozniak on the campus of Columbia University. “There’s a question I’ve always wanted to ask you,” I said. “What happened with the Mac? You could open up the Apple II, and there were slots and so on, and anyone could write for it. The Mac was way more closed. What happened?” “Oh,” said Wozniak. “That was Steve. He wanted it that way. The Apple II was my machine, and the Mac was his.” Apple’s origins were pure Steve Wozniak, but as everyone knows, it was the other founder, Steve Jobs, whose ideas made Apple what it is today. Jobs maintained the early image that he and Wozniak created, but beginning with the Macintosh in the 1980s, and accelerating through the age of the iPod, iPhone, and iPad, he led Apple computers on a fundamentally different track.
And it was Wozniak who would conceive of and build the Apple and the Apple II, the most important Apple products ever, and arguably among the most important inventions of the later twentieth century.* For his part, Jobs was the businessman and the dealmaker of the operation, essential as such, but hardly the founding genius of Apple computers, the man whose ideas were turned into silicon to change the world; that was Wozniak. The history of the firm must be understood in this light. For while founders do set the culture of a firm, they cannot dictate it in perpetuity; as Wozniak withdrew from the operation, Apple became more and more concerned with, as it were, the aesthetics of radicalism than with its substance. Steve Wozniak is not the household name that Steve Jobs is, but his importance to communications and culture in the postwar period merits a closer look.
Now as they spoke, a warm glow began to develop between a Montague and a Capulet who fantasized about all-embracing alliance between their seemingly irreconcilable houses. Theirs was to be a union that could move mountains—or at least break down the old barriers and create a perfect new world.2 The two moguls plotting the future of the Internet had something else in common: neither was what you might call a natural computer geek, in the manner of Bill Gates or Steve Jobs. Entrepreneurs like Apple’s Steve Wozniak got started by programming and soldering; Case was an assistant brand manager at Procter & Gamble in Kansas. He might have languished somewhere in upper middle management had he not resolved to grab the ring. Case took a job at a risky computer networking firm named the Control Video Corporation that had already failed twice. Three’s a charm, however: by some miracle, that firm eventually managed to become America Online.3 Once a corporate lawyer, Levin during these years was working as a cable executive.
Becoming Steve Jobs: The Evolution of a Reckless Upstart Into a Visionary Leader by Brent Schlender, Rick Tetzeli
Albert Einstein, Apple II, Apple's 1984 Super Bowl advert, Bill Gates: Altair 8800, Byte Shop, computer age, corporate governance, El Camino Real, Isaac Newton, Jony Ive, market design, McMansion, Menlo Park, Paul Terrell, popular electronics, QWERTY keyboard, Ronald Reagan, Sand Hill Road, side project, Silicon Valley, Silicon Valley startup, skunkworks, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Tim Cook: Apple, Wall-E, Watson beat the top human players on Jeopardy!, Whole Earth Catalog
Part of my unease came from the fact that, for the first time in my experience as a journalist, I would be calling on a prominent business leader who was younger than I. I was thirty-two years old; Jobs was thirty-one and already a global celebrity, hailed, along with Bill Gates, for having invented the personal computer industry. Long before Internet mania started churning out wunderkinds of the week, Jobs was technology’s original superstar, the real deal with an astounding, substantial record. The circuit boards he and Steve Wozniak had assembled in a garage in Los Altos had spawned a billion-dollar company. The personal computer seemed to have unlimited potential, and as the cofounder of Apple Computer, Steve Jobs had been the face of all those possibilities. But then, in September of 1985, he had resigned under pressure, shortly after telling the company’s board of directors that he was courting some key Apple employees to join him in a new venture to build computer “workstations.”
Rock loved order, he loved processes, he believed that tech companies grew in certain ways according to certain rules, and he subscribed to these beliefs because he’d seen them work before, most notably at Intel, the great Santa Clara chipmaker that he had backed early on. Rock was perhaps the most notable tech investor of his time, but he in fact had been reluctant to back Apple at first, largely because he’d found Steve and his partner Steve Wozniak unpalatable. He didn’t see Apple the way Jobs saw it—as an extraordinary company that would humanize computing and do so with a defiantly unhierarchical organization. Rock simply viewed it as another investment. Steve found board meetings with Rock enervating, not invigorating; he had looked forward to a long, fast drive to Marin with the top down to get rid of the stale stench of seemingly endless discussion.
For the third time in three years, he tried to hire Lasseter away from Pixar. Lasseter wouldn’t go. “I was living in the San Francisco Bay Area,” he remembers. “I was inventing new stuff. I figured I’d just stay on here. I’d had a pretty miserable experience at Disney.” He told Schneider that there was only one way that he’d consider working with Disney—the studio would have to make a movie with Pixar.

The Evolution of a CEO (photo captions):
Steve Jobs and Steve Wozniak in 1979. The two had founded Apple four years earlier, and the company was growing like crazy. But the best years of their collaboration were already over. Ted Thai/Polaris
A 1979 gathering of the Seva Foundation, which Steve backed with a $5,000 donation. His close friend Larry Brilliant is at the center with his baby boy, Joseph; Brilliant’s wife, Girija, is to the right, arms crossed and leaning back.
Apple II, augmented reality, autonomous vehicles, bioinformatics, Build a better mousetrap, business process, cloud computing, computer vision, cyber-physical system, distributed generation, game design, Grace Hopper, Richard Feynman, Silicon Valley, skunkworks, Skype, smart transportation, speech recognition, statistical model, stealth mode startup, Steve Jobs, Steve Wozniak, the market place, Yogi Berra
Calvert: These are the things that an inventor really needs to learn and work out before taking that big step of sending a patent application to the USPTO—hopefully as the prelude to starting up their own business.

1. www.uspto.gov/inventors/independent/eye/201206/index.jsp
2. www.uiausa.org

CHAPTER 23
Steve Wozniak
Co-Founder, Apple Computer

A Silicon Valley icon and philanthropist for more than thirty years, Steve Wozniak helped shape the computing industry with his design of Apple’s first line of products, the Apple I and II, and influenced the popular Macintosh. In 1976, Wozniak and Steve Jobs founded Apple Computer Inc. with Wozniak’s Apple I personal computer. The following year he introduced his Apple II personal computer, featuring a central processing unit, a keyboard, color graphics, and a floppy disk drive. The Apple II was integral to launching the personal computer industry. Wozniak is named sole inventor on the US patent for “microcomputer for use with video display.”
Tim Leatherman, Folding Hand Tools
Chapter 15. Reyn Guyer, Toys
Chapter 16. Bernhard van Lengerich, Food Manufacturing
Chapter 17. Curt Croley, Shane MacGregor, Graham Marshall, Mobile Devices
Chapter 18. Matthew Scholz, Healthcare Products
Chapter 19. Daria Mochly-Rosen, Drugs
Chapter 20. Martin Keen, Footwear
Chapter 21. Kevin Deppermann, Seed Genomes
Chapter 22. John Calvert, Elizabeth Dougherty, USPTO
Chapter 23. Steve Wozniak, Personal Computers
Index
About the Author

Brett Stern is an industrial designer and inventor living in Portland, Oregon. He holds eight utility patents covering surgical instruments, medical implants, and robotic garment-manufacturing systems. He holds trademarks in 34 countries on a line of snack foods that he created. He has worked as an industrial design consultant for such clients as Pfizer, Revlon, and Saatchi & Saatchi, and as a costume materials technologist for Warner Bros.
Today, at the dawn of the nexus of the future, ideas for inventions stand only a small chance of being realized and competing in the marketplace unless they’re generated or picked up by corporations that can marshal teams of scientists and lawyers underwritten by enterprise-scale capital and infrastructure. Nonetheless, millions of individuals still cherish the dream of inventing and building a better mousetrap, bringing it to market, and being richly rewarded for those efforts. Americans love their pantheon of garage inventors. Thomas Edison, the Wright Brothers, Alexander Graham Bell, Bill Hewlett and Dave Packard, and Steve Wozniak and Steve Jobs are held up as culture heroes, celebrated for their entrepreneurial spirit no less than their inventive genius. This book is a collection of interviews conducted with individuals who have distinguished themselves in the invention space. Some of the inventors interviewed here have their Aha! moments in government, institutional, or industrial labs; develop their inventions with multidisciplinary teams of experts; and leave the marketing of their inventions to other specialists in the organization.
Equal Is Unfair: America's Misguided Fight Against Income Inequality by Don Watkins, Yaron Brook
3D printing, Affordable Care Act / Obamacare, Apple II, barriers to entry, Berlin Wall, Bernie Madoff, blue-collar work, business process, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, collective bargaining, colonial exploitation, corporate governance, correlation does not imply causation, Credit Default Swap, crony capitalism, David Brooks, deskilling, Edward Glaeser, Elon Musk, en.wikipedia.org, financial deregulation, immigration reform, income inequality, indoor plumbing, inventory management, invisible hand, Isaac Newton, Jeff Bezos, Jony Ive, laissez-faire capitalism, Louis Pasteur, low skilled workers, means of production, minimum wage unemployment, Naomi Klein, new economy, obamacare, Peter Singer: altruism, Peter Thiel, profit motive, rent control, Ronald Reagan, Silicon Valley, Skype, statistical model, Steve Jobs, Steve Wozniak, The Spirit Level, too big to fail, trickle-down economics, Uber for X, urban renewal, War on Poverty, women in the workforce, working poor
Spending on the Basics as a Share of Disposable Personal Income,” HumanProgress.org, http://humanprogress.org/static/us-spending-on-basics (accessed April 13, 2015).
5. Steve Wozniak with Gina Smith, iWoz: Computer Geek to Cult Icon (New York: Norton, 2006), pp. 12–13.
6. Ibid., p. 18.
7. Ibid., pp. 54–55.
8. Ibid., pp. 155–56.
9. “National Inventors Hall of Fame,” Ohio History Central, http://www.ohiohistorycentral.org/w/National_Inventors_Hall_of_Fame?rec=1727 (accessed August 31, 2015).
10. Quoted in Sean Rossman, “Apple’s ‘The Woz’ Talks Jobs, Entrepreneurship,” Tallahassee Democrat, November 6, 2014, http://www.tallahassee.com/story/news/local/2014/11/05/apples-woz-talks-jobs-entrepreneurship/18561425/ (accessed April 13, 2015).
11. Quoted in Alec Hogg, “Apple’s ‘Other’ Steve—Wozniak on Jobs, Starting a Business, Changing the World, and Staying Hungry, Staying Foolish,” BizNews.com, February 17, 2014, http://www.biznews.com/video/2014/02/17/apples-other-steve-wozniak-on-jobs-starting-a-business-changing-the-world/ (accessed April 13, 2015).
12.
But we do live on a Glorious Earth, where we can make life amazing. And it can be amazing for everyone, because it turns out that the way we improve our lives—ingenuity and effort—is not a fixed-sum game, where we battle over a static amount of wealth. We produce wealth, and there is no limit to how much wealth we can produce. Who Created the Modern World? In his autobiography, Apple cofounder Steve Wozniak, or Woz, as he’s usually called, describes how his dad, an engineer, would explain to the four-year-old Woz how electronics worked. “I remember sitting there and being so little, and thinking: ‘Wow, what a great, great world he’s living in,’” Woz recalls. “I mean, that’s all I thought: ‘Wow.’ For people who know how to do this stuff—how to take these little parts and make them work together to do something—well, these people must be the smartest people in the world. . . .
Maybe you teach music to middle-schoolers. Maybe you fix cars or perform brain surgery or, God help you, write books on inequality. Whatever it is, you do productive work in exchange for money, which you use to buy the dizzying array of things that other people produce. But that’s only part of the story. Not all work is equally productive. Some of us create a little wealth. Some of us create a lot. A tiny handful, like Steve Wozniak, create so much that their names go down in the history books. Think of some of the things that make your life wonderful. Your cell phone, your computer, the Internet? You can thank Robert Noyce and Jack Kilby, who invented the integrated circuit. The car that took you to work? You can thank Henry Ford, who transformed the automobile from a curiosity of the rich into a mass-market product.
The Hacker Crackdown by Bruce Sterling
Apple II, back-to-the-land, game design, ghettoisation, Haight Ashbury, Howard Rheingold, HyperCard, index card, informal economy, Jaron Lanier, pirate software, plutocrats, Silicon Valley, Steve Wozniak, Steven Levy, Stewart Brand, the scientific method, Whole Earth Catalog, Whole Earth Review
Before computers and their phone-line modems entered American homes in gigantic numbers, phone phreaks had their own special telecommunications hardware gadget, the famous "blue box." This fraud device (now rendered increasingly useless by the digital evolution of the phone system) could trick switching systems into granting free access to long-distance lines. It did this by mimicking the system's own signal, a tone of 2600 hertz. Steve Jobs and Steve Wozniak, the founders of Apple Computer, Inc., once dabbled in selling blue-boxes in college dorms in California. For many, in the early days of phreaking, blue-boxing was scarcely perceived as "theft," but rather as a fun (if sneaky) way to use excess phone capacity harmlessly. After all, the long-distance lines were JUST SITTING THERE.... Whom did it hurt, really? If you're not DAMAGING the system, and you're not USING UP ANY TANGIBLE RESOURCE, and if nobody FINDS OUT what you did, then what real harm have you done?
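Sterling's 2600 hertz detail is the whole trick: on an in-band signaling network, a pure sine tone at the trunk-supervision frequency was indistinguishable from the switch's own signal, and detecting one frequency in an audio stream is a single-bin Fourier measurement. A small Python sketch using the Goertzel algorithm, purely as an illustration of the signal processing involved (the 8 kHz sample rate is an assumed telephone-grade rate, not a figure from the text):

```python
import math

RATE = 8000  # samples per second; an assumed telephone-grade rate for this sketch

def goertzel_power(samples, freq, rate=RATE):
    """Relative power of one frequency in a block of samples (Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / rate)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

# A block of pure 2600 Hz tone, the trunk-signaling frequency blue boxes mimicked.
tone = [math.sin(2 * math.pi * 2600 * t / RATE) for t in range(400)]

# The detector responds strongly at 2600 Hz and barely at an unrelated frequency.
print(goertzel_power(tone, 2600) > goertzel_power(tone, 1000))
```

The same one-frequency filter, run over a live audio stream, is essentially how single-tone detectors in signaling equipment decide that "the tone" is present.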
On the contrary, like most rock musicians, the Grateful Dead have spent their entire adult lives in the company of complex electronic equipment. They have funds to burn on any sophisticated tool and toy that might happen to catch their fancy. And their fancy is quite extensive. The Deadhead community boasts any number of recording engineers, lighting experts, rock video mavens, electronic technicians of all descriptions. And the drift goes both ways. Steve Wozniak, Apple's co-founder, used to throw rock festivals. Silicon Valley rocks out. These are the 1990s, not the 1960s. Today, for a surprising number of people all over America, the supposed dividing line between Bohemian and technician simply no longer exists. People of this sort may have a set of windchimes and a dog with a knotted kerchief 'round its neck, but they're also quite likely to own a multimegabyte Macintosh running MIDI synthesizer software and trippy fractal simulations.
Furthermore, proclaimed the manifesto, the foundation would "fund, conduct, and support legal efforts to demonstrate that the Secret Service has exercised prior restraint on publications, limited free speech, conducted improper seizure of equipment and data, used undue force, and generally conducted itself in a fashion which is arbitrary, oppressive, and unconstitutional." "Crime and Puzzlement" was distributed far and wide through computer networking channels, and also printed in the Whole Earth Review. The sudden declaration of a coherent, politicized counter-strike from the ranks of hackerdom electrified the community. Steve Wozniak (perhaps a bit stung by the NuPrometheus scandal) swiftly offered to match any funds Kapor offered the Foundation. John Gilmore, one of the pioneers of Sun Microsystems, immediately offered his own extensive financial and personal support. Gilmore, an ardent libertarian, was to prove an eloquent advocate of electronic privacy issues, especially freedom from governmental and corporate computer-assisted surveillance of private citizens.
Apple II, bounce rate, Byte Shop, Cal Newport, capital controls, cleantech, Community Supported Agriculture, deliberate practice, financial independence, follow your passion, Frank Gehry, job satisfaction, job-hopping, knowledge worker, Mason jar, medical residency, new economy, passive income, Paul Terrell, popular electronics, renewable energy credits, Results Only Work Environment, Richard Bolles, Richard Feynman, rolodex, Sand Hill Road, side project, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, web application, winner-take-all economy
It’s at this point that Jobs’s ascent begins to accelerate. He takes on $250,000 in funding from Mike Markkula and works with Steve Wozniak to produce a new computer design that is unambiguously too good to be ignored. There were other engineers in the Bay Area’s Homebrew Computer Club culture who could match Jobs’s and Wozniak’s technical skill, but Jobs had the insight to take on investment and to focus this technical energy toward producing a complete product. The result was the Apple II, a machine that leaped ahead of the competition: It had color graphics; the monitor and keyboard were integrated inside the case; the architecture was open, allowing rapid expansion of memory and peripherals (such as the floppy disk, which the Apple II was the first to introduce into mainstream use). This was the product that put the company on the map and that turned Jobs from a small-time entrepreneur into the head of a visionary company.
During this period, Jobs split his time between Atari and the All-One Farm, a country commune located north of San Francisco. At one point, he left his job at Atari for several months to make a mendicant’s spiritual journey through India, and on returning home he began to train seriously at the nearby Los Altos Zen Center. In 1974, after Jobs’s return from India, a local engineer and entrepreneur named Alex Kamradt started a computer time-sharing company dubbed Call-in Computer. Kamradt approached Steve Wozniak to design a terminal device he could sell to clients for accessing his central computer. Unlike Jobs, Wozniak was a true electronics whiz who was obsessed with technology and had studied it formally at college. On the flip side, however, Wozniak couldn’t stomach business, so he allowed Jobs, a longtime friend, to handle the details of the arrangement. All was going well until the fall of 1975, when Jobs left for the season to spend time at the All-One commune.
Among others, I introduced Apple founder Steve Jobs, radio host Ira Glass, and master surfboard shaper Al Merrick. Using this trio as our running example, I can now ask: what is it specifically about these three careers that makes them so compelling? Here are the answers that I came up with:

TRAITS THAT DEFINE GREAT WORK

Creativity: Ira Glass, for example, is pushing the boundaries of radio, and winning armfuls of awards in the process.

Impact: From the Apple II to the iPhone, Steve Jobs has changed the way we live our lives in the digital age.

Control: No one tells Al Merrick when to wake up or what to wear. He’s not expected in an office from nine to five. Instead, his Channel Islands Surfboards factory is located a block from the Santa Barbara beach, where Merrick still regularly spends time surfing. (Jake Burton Carpenter, founder of Burton Snowboards, for example, recalls how negotiations for the merger between the two companies happened while he and Merrick waited for waves in a surf lineup.)
Intertwingled: The Work and Influence of Ted Nelson (History of Computing) by Douglas R. Dechow
3D printing, Apple II, Bill Duvall, Brewster Kahle, Buckminster Fuller, Claude Shannon: information theory, cognitive dissonance, computer age, conceptual framework, Douglas Engelbart, Dynabook, Edward Snowden, game design, HyperCard, hypertext link, information retrieval, Internet Archive, Jaron Lanier, knowledge worker, linked data, Marshall McLuhan, Menlo Park, Mother of all demos, pre–internet, RAND corporation, semantic web, Silicon Valley, software studies, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, the medium is the message, Vannevar Bush, Wall-E, Whole Earth Catalog
Technical compromises made in the early days of the World Wide Web undermined Ted’s ability to implement hypertext on a large scale. He continues to rail at this constraint. Forty years after Computer Lib, computers are far more sophisticated and the networks among digital objects are much richer and more complex. It is time to revisit fundamental assumptions of networked computing, such as the directionality of links, a point made by multiple speakers at the symposium—Wendy Hall, Jaron Lanier, Steve Wozniak, and Rob Akscyn amongst them.1

Fig. 10.3 Ordinary hypertext, with multi-directional links. From Literary Machines (Used with permission)

10.2.3 Managing Research Data

Managing research data is similarly a problem of defining and maintaining relationships amongst multimedia objects. Research data do not stand alone. They are complex objects that can be understood only in relation to their context, which often includes software, protocols, documentation, and other entities scattered over time and space.
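The directionality point can be made concrete. A web hyperlink is recorded only at its source, so the target page has no knowledge of who points at it; the hypertext Nelson envisioned keeps every link visible from both ends. A minimal sketch in Python of the latter idea (the class and method names here are invented for illustration, not drawn from any real system):

```python
from collections import defaultdict

class LinkStore:
    """A toy registry of multi-directional links: each link is indexed
    at both endpoints, so either document can enumerate its connections."""

    def __init__(self):
        self.outgoing = defaultdict(set)
        self.incoming = defaultdict(set)

    def link(self, source, target):
        self.outgoing[source].add(target)
        self.incoming[target].add(source)  # the target also knows who points at it

    def neighbors(self, doc):
        # Traverse links in either direction, unlike a one-way web link.
        return self.outgoing[doc] | self.incoming[doc]

store = LinkStore()
store.link("Computer Lib", "Literary Machines")
print(store.neighbors("Literary Machines"))  # → {'Computer Lib'}
```

The design cost this sketch glosses over is exactly the one the web's authors avoided: keeping the two indexes consistent when documents live on different machines.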
Wardrip-Fruin N, Montfort N (eds) (2003) The new media reader. MIT Press, Cambridge, MA
17. Wing JM (2006) Computational thinking. Commun ACM 49(3)
18. Wozniak S (2014) In “Intertwingled: afternoon session #2.” Chapman University, Orange, California. Video timecode: 58:14. http://ibc.chapman.edu/Mediasite/Play/52694e57c4b546f0ba8814ec5d9223ae1d

Footnotes
1. For example, as Steve Wozniak said at Intertwingled, “At our computer club, the bible was Computer Lib” — referring to the Homebrew Computer Club, from which Apple Computer and other major elements of the turn to personal computers emerged.
2. “Computational thinking is the process of recognising aspects of computation in the world that surrounds us, and applying tools and techniques from Computer Science to understand and reason about both natural and artificial systems and processes”.
3. “Computational Media” has recently emerged as a name for the type of work that performs this interdisciplinary integration.
4. Kodu is both an influential system itself and the basis of Microsoft’s Project Spark, launched in October 2014.
5. The first stage of our work is described in “Say it With Systems”.
From the age of about three or four I was particularly fascinated by “exclusive or” light switches, where you have a room with the need for switches at two different doors and so they are wired up in such a way that both switches control the light and you can turn it on or off from either door. As a child I then went on to explore in sequence: electricity, electronics, digital electronics and early computers. We had ancient computers at my school. We had a PDP-8 and then an LSI-11 and an Apple II and so on up through the history of computers. I was interested in each level of hardware: how the physics of transistors worked, how digital circuits were put together, and how CPUs operated. When I was young, I designed a simple CPU and a simple operating system. I asked my brother to sit underneath a desk, fed him instructions, and had him execute them. In parallel with that interest, I have also always been interested in culture, both national cultures and popular culture.
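The two-door wiring described above is exactly the boolean "exclusive or" function: the light is on precisely when the two switch positions differ, so flipping either switch always toggles it. A minimal sketch:

```python
# The two-door light circuit as a truth function: the light is on
# exactly when the two switch positions differ (XOR).
def light_on(switch_a: bool, switch_b: bool) -> bool:
    return switch_a != switch_b  # boolean exclusive or

# Flipping either switch, from any starting state, toggles the light:
for a in (False, True):
    for b in (False, True):
        assert light_on(not a, b) != light_on(a, b)
        assert light_on(a, not b) != light_on(a, b)
```

This is why the circuit works from either door: XOR is symmetric in its inputs, and negating either input negates the output.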
Alexey Pajitnov wrote Tetris, Apple II, cellular automata, Columbine, Conway's Game of Life, game design, In Cold Blood by Truman Capote, Mars Rover, Mikhail Gorbachev, Ralph Waldo Emerson, Ray Oldenburg, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, The Great Good Place, Thorstein Veblen, urban planning
Tobey was a boy genius, a brilliant but occasionally arrogant artistic phenomenon who was working on computer games while still in high school, toiling at babysitting jobs to pay for his $800 Commodore PET 2001. Tobey spent most of his time at the computer trying to make a game that was as close to real life as a computer in the 1980s could make it. Through word of mouth, Tobey’s flying and shooting game based on F-15 fighter jets came to the attention of Apple’s Steve Wozniak when Tobey was just sixteen. Wozniak was wowed by the sound, graphics, and game play. He kept saying, “This can’t be done on the Apple II. I can’t believe it. This can’t be done.” He gave Tobey a calling card and added a note to Trip Hawkins, which read, “Please consider this flight simulator as the finest Apple game ever done.” Hawkins didn’t waste any time. He wanted to make a deal right away. Tobey’s parents came with him to EA’s offices to oversee a lucrative royalty deal for Skyfox, a game that would eventually sell more than a million copies.
From plastic dust they were born and to plastic dust and desert sand they returned. In 1975 that plastic hadn’t been worthless at all. It was precious gold to the principals of Atari, and it would only become more valuable as the decade progressed. Atari’s arcade business was still thriving, and Home Pong exceeded sales expectations, with demand outstripping supply. Alcorn hired an unkempt and unshaven Steve Jobs, who in turn asked his best friend, the diffident genius Steve Wozniak, for help with what would be one of Atari’s most popular additions to its ever expanding library. Without telling Alcorn, Bushnell asked Jobs to help him streamline the innards of a brick-breaking arcade game called Breakout. Bushnell wanted to save money because the chips used in each arcade machine were still pricey at the time. He coaxed the brazen, odoriferous Jobs with $750 and a $100 bonus for each chip removed from the prototype.
The graphics and play in the inaugural Madden effort that was finally released for the Apple II in June 1989 were like caveman drawings when measured by today’s standards. The art looked like a cheap cartoon. Only sixteen of the NFL’s twenty-eight teams were represented. While the real players were there, the teams’ logos weren’t. And while the stats for each player were carefully honed for realism’s sake, every player looked the same. On the cover of the first game, the smiling Madden, holding a football running back–style, looks as much surprised as he is happy. It’s as if he’s about to say, “Gee, I know football, but what’s this videogame thing all about?” Nonetheless, the gaming world went wild over the game. Nibble, an Apple II enthusiast magazine of the time, detailed the many functions of the game and highlighted the news that you could make your own plays “if you’re really serious about football.”
Apple II, Brian Krebs, Burning Man, corporate governance, dumpster diving, Exxon Valdez, Hacker Ethic, hive mind, index card, McMansion, Mercator projection, offshore financial centre, packet switching, pirate software, Ponzi scheme, Robert Hanssen: Double agent, Saturday Night Live, Silicon Valley, Steve Jobs, Steve Wozniak, Steven Levy, traffic fines, web application, WikiLeaks, zero day, Zipcar
But hacking was above all a creative effort, one that would lead to countless watershed moments in computer history. The word “hacker” took on darker connotations in the early 1980s, when the first home computers—the Commodore 64s, the TRS-80s, the Apples—came to teenagers’ bedrooms in suburbs and cities around the United States. The machines themselves were a product of hacker culture; the Apple II, and with it the entire home computer concept, was born of two Berkeley phone phreaks named Steve Wozniak and Steve Jobs. But not all teenagers were content with the machines, and in the impatience of youth, they weren’t inclined to wait for grad school to dip into real processing power or to explore the global networks that could be reached with a phone call and the squeal of a modem. So they began illicit forays into corporate, government, and academic systems and took their first tentative steps into the ARPANET, the Internet’s forerunner.
See http://www.securityfocus.com/comments/articles/203/5729/threaded (May 24, 2001). Max says he did not consider himself an informant and only provided technical information.

Chapter 4: The White Hat
1. The first people to identify themselves as hackers: The seminal work on the early hackers is Steven Levy, Hackers: Heroes of the Computer Revolution (New York: Anchor Press/Doubleday, 1984). Also see Steve Wozniak and Gina Smith, iWoz: From Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It (New York: W. W. Norton and Company, 2006).
2. Tim was at work one day: This anecdote was recalled by Tim Spencer. Max later recalled Spencer’s advice in a letter to his sentencing judge in Pittsburgh.
3. If there was one thing Max: Details of Max’s relationship with Kimi come primarily from interviews with Kimi.
4. Max went up to the city to visit Matt Harrigan: Harrigan’s business and his work with Max were described primarily by Harrigan, with some details confirmed by Max.
Jony Ive: The Genius Behind Apple's Greatest Products by Leander Kahney
Apple II, banking crisis, British Empire, Dynabook, global supply chain, interchangeable parts, Jony Ive, race to the bottom, RFID, side project, Silicon Valley, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, the built environment, thinkpad, Tim Cook: Apple
It was more than four years since Brunner had written his conceptual brief. With the much-anticipated twentieth anniversary of the Macintosh approaching, the decision was made to designate Spartacus as a special edition. Officially named the Twentieth Anniversary Macintosh, the new product was limited to a run of just twenty thousand units. Apple unveiled it at Macworld in January 1997 and the first two units were given to Steve Jobs and Steve Wozniak, who had just returned to the company as advisers. To make it more memorable, the machine was hand-delivered to customers’ homes by specially trained “concierges,” who set up the machines, installed any expansion cards (along with the ugly hunchback) and showed users how to use them. “I think it is the first sensible computer design that we have seen in a long time,” said Henry Steiner, Hong Kong’s most eminent graphic designer.
“I think it is the first sensible computer design that we have seen in a long time,” said Henry Steiner, Hong Kong’s most eminent graphic designer. “It is quite beautiful and desirable. It has the status value of a Porsche. The fact that the machine combines computer, television and stereo system is impressive.” Like the MessagePad, the Twentieth Anniversary Macintosh (TAM) won not only kudos but awards, including the Best of Category prize for I.D. magazine’s Annual Design Review. Steve Wozniak thought it was the perfect college machine “with the computer, TV, radio, CD player and more (AV even) all in one sleek machine.” He had several at his mansion in the hills of Los Gatos above Silicon Valley. By the time the machine was pulled from the market one year after launch, however, Wozniak seemed to be the only person on the planet who liked it. The TAM bombed in the marketplace.
In addition, it shouldn’t take up too much space on a desk, so Jobs and his design team decided it should have an unusual vertical orientation, with the disk drive below the monitor, instead of to the side like other machines at the time. The design process continued for several months, with a sequence of prototypes and endless discussions. Material evaluations led to the choice of the tough ABS plastic used to make LEGO bricks, which would give the new machine a fine, scratch-resistant texture. Troubled by the way earlier Apple IIs had turned orange in sunlight over time, Manock decided to make the Macintosh beige, initiating a trend that would last twenty years. As Jony would do in the next generation at Apple, Jobs paid close attention to every detail. Even the mouse was designed to reflect the shape of the computer, with the same proportions, and a single square button that corresponded to the shape and placement of the screen.
Dogfight: How Apple and Google Went to War and Started a Revolution by Fred Vogelstein
Apple II, cloud computing, disintermediation, don't be evil, Dynabook, Firefox, Google Chrome, Google Glasses, Googley, Jony Ive, Mark Zuckerberg, Peter Thiel, pre–internet, Silicon Valley, Silicon Valley startup, Skype, software patent, spectrum auction, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Tim Cook: Apple, web application
Head of Global Marketing Phil Schiller went to Chicago. Jony Ive and his design crew went to San Francisco. Steve Jobs’s store was, naturally, the one in downtown Palo Alto at the corner of University Avenue and Kipling Street. It was a mile and a half from his house and he often showed up there unannounced when he was in town. The appropriate high-tech luminaries had already gathered when he arrived. Apple cofounder Steve Wozniak and early Apple employees Bill Atkinson and Andy Hertzfeld were already standing on line. But it also seemed as if Jobs had some internal flames to fan of his own, said one of the engineers who was there along with Grignon and many others who had worked on the project, including Fadell and Forstall. “So there’s this reunion of the original Mac guys, and it’s really cool. And then Steve goes up to Tony [Fadell] and proceeds to go over in a corner of the store and talk to him for an hour and ignore Forstall just to fuck with him.”
But many others decided they now needed only two, and they started ditching their Microsoft-run Dell, HP, Toshiba, Acer, and Lenovo laptops at an accelerating clip. The shift hit Dell so hard that by the beginning of 2013 it was trying to take itself private to retrench. Jobs was particularly satisfied with this development, a confidant said—even though in the context of the other upheavals the iPad was unleashing it was almost a footnote. Thirty-five years after starting Apple with Steve Wozniak, Jobs was finally doing what he had set out to do all along: he was transforming what consumers and businesses expected from their computers. The Macintosh in 1984—the first mainstream machine to use a mouse—was supposed to have been the machine that did this. It was supposed to have taken a complicated device—the PC—and made it a consumer product that anyone could use. That failed. As everyone knows, Macs didn’t go away, but Microsoft Windows and Office get the credit for making the PC mainstream.
Fadell was getting ready to start his own company when Apple’s head of hardware, Jon Rubinstein, called, trying to recruit Fadell for a job that, astonishingly, he was not allowed to disclose. According to Steven Levy’s book The Perfect Thing, Fadell took the call on a ski slope in Colorado in January and expressed interest on the spot. He had idolized Apple since he was twelve, according to Levy. That was when he’d spent the summer of ’81 caddying to save up enough money to buy an Apple II. Weeks after Rubinstein’s call, Fadell joined Apple, only discovering then that he was being hired as a consultant to help build the first iPod. Grignon and others have said that Fadell’s rise never sat well with Forstall. Up until Fadell joined Apple, Jobs’s inner circle was composed of people he’d worked closely with at least from the beginning of his return in 1997, and in some cases from his days running NeXT, the computer company he’d founded after getting fired from Apple in 1985.
Any sufficiently advanced technology is indistinguishable from magic, Apple II, back-to-the-land, Bill Duvall, Bill Gates: Altair 8800, Buckminster Fuller, California gold rush, card file, computer age, computer vision, conceptual framework, cuban missile crisis, Douglas Engelbart, Dynabook, El Camino Real, general-purpose programming language, Golden Gate Park, Hacker Ethic, hypertext link, informal economy, information retrieval, invention of the printing press, Jeff Rulifson, John Nash: game theory, John von Neumann, Kevin Kelly, knowledge worker, Mahatma Gandhi, Menlo Park, Mother of all demos, Norbert Wiener, packet switching, Paul Terrell, popular electronics, QWERTY keyboard, RAND corporation, RFC: Request For Comment, Richard Stallman, Robert X Cringely, Sand Hill Road, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, speech recognition, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Thorstein Veblen, Turing test, union organizing, Vannevar Bush, Whole Earth Catalog, William Shockley: the traitorous eight
Terminal, TV typewriter? I/O Device? or some other digital black-magic box? Or are you buying time on a time-sharing service? If so you might like to come to a gathering of people with like-minded interests. Exchange information, swap ideas, talk shop, help work on a project, whatever…13 One person who saw the flyer was Allen Baum, who was working at Hewlett-Packard at the time with his friend Steve Wozniak. The two had met in high school when Baum had seen Wozniak sitting in his homeroom class drawing strange graphics in a notebook. “What are you doing?” Baum asked. “I’m designing a computer,” was Wozniak’s reply. It turned out that Baum had on his own become intrigued with computers just months earlier after his father, who had moved the family from the East Coast, took a job at Stanford Research Institute.
This interview is the clearest and most comprehensive account of Engelbart’s career, and I have relied on it extensively.
2. Ibid.
3. Ibid.
4. There is some confusion on this point. At various times Engelbart has said that he found the original article in the library and at other times he has said he believed he first read the Life account of Vannevar Bush’s Memex. Whatever the case, it had a defining impact on him.
5. Vannevar Bush, “As We May Think,” Atlantic Monthly, July 1945.
6. Lowood and Adams, oral history.
7. Ibid. Twenty years later, a young Steve Wozniak, then a brand-new HP engineer, would ask the company if they wanted to sell a personal computer. HP said it wasn’t interested, and Wozniak went off to cofound Apple Computer. It was the second time the Silicon Valley pioneer missed an opportunity to define the future of computing.
8. Ibid.
9. Jack Goldberg, Stanford Research Institute, e-mail to author.
10. Author interview, Charles Rosen, Menlo Park, Calif., October 10, 2001.
11. Douglas C.
A frustrated Engelbart began to explore the idea of remotely connecting to the SDC computer from the Control Data minicomputer in Menlo Park using an early modem. Unfortunately his engineers were never able to make the system communicate reliably. As a result, for the next two years Engelbart’s fledgling Augmented Human Intellect Research Center began to build his system on a computer that had far less processing power than an Apple II of a decade and a half later. The Menlo Park computer used the magnetic-core memory that Engelbart, Crane, and English had all worked on improving in the fifties. It had a capacity of eight thousand twelve-bit characters—a little more than three pages of typed text—in its main memory. Instead of on a disk drive, it stored information permanently on a rotating drum that could hold thirty-two thousand characters.
Dealers of Lightning by Michael A. Hiltzik
Apple II, Apple's 1984 Super Bowl advert, Bill Duvall, Bill Gates: Altair 8800, computer age, Dynabook, El Camino Real, index card, Jeff Rulifson, Joseph Schumpeter, Marshall McLuhan, Menlo Park, oil shock, popular electronics, Ronald Reagan, Silicon Valley, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, the medium is the message, Vannevar Bush, Whole Earth Catalog
The idea was not wholly implausible. Apple was coming on strong. Started in the proverbial Silicon Valley garage by Jobs and his high school classmate Steve Wozniak, Apple had successfully negotiated the transition in its product line from kit versions of Woz’s little personal computer to a more versatile version, the Apple II. This machine was unique in the hobbyist market. It came already assembled, with a keyboard (although it required a separate monitor). Shortly after Jobs’s appearance before Zarem’s group, Apple started bundling it with VisiCalc, a unique software program known as a financial spreadsheet—a “killer app” that would single-handedly turn the Apple II into a popular businessman’s tool. With fewer than forty employees in 1978, Apple was already one of the most sought-after investments among the small community of speculative private investors known as venture capitalists.
January: The Altair 8800, a hobbyist’s personal computer sold as a mail-order kit, is featured on the cover of Popular Electronics, enthralling a generation of youthful technology buffs—among them, Bill Gates—with the possibilities of personal computing.
February: PARC engineers demonstrate for their colleagues a graphical user interface for a personal computer, including icons and the first use of pop-up menus, that will develop into the Windows and Macintosh interfaces of today.
March 1: PARC’s permanent headquarters at 3333 Coyote Hill Road are formally opened.
January 3: Apple Computer is incorporated by Steve Jobs and Steve Wozniak.
August: Having perfected a new technology for designing high-density computer chips at PARC, Lynn Conway and Carver Mead begin drafting Introduction to VLSI Systems, a textbook on the technology that is written and typeset entirely on desktop publishing systems invented at the center.
August 18: Xerox shelves a plan to market the Alto as a commercial project, closing the door to any possibility that the company will be in the vanguard of personal computing.
With BravoX nearing completion he was unsure of his next step, especially given the absence of any sign that Xerox meant to follow up ASD’s market probes with a full-scale merchandising program. He had only grown more restive when a friend showed him an Apple II running VisiCalc. The spreadsheet program was new to him but dazzling in its power. One typed numbers or formulas into the cells of a grid and linked them, so the answer from one cell could be part of the formula of another. This allowed anyone to tabulate data in an infinite number of permutations. It was particularly valuable for businessmen and engineers, who could perform “what-if” analyses simply by altering a figure here or there and letting the grid automatically calculate the myriad ramifications of the change. Sure enough, within months VisiCalc had transformed the Apple II into a commercial sensation. By contrast, at PARC, where funds had flowed so limitlessly that no one ever felt the urge to run “what-if” budget scenarios, the spreadsheet idea had not even occurred to the greatest software engineers in the world.
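The cell-linking mechanism described here can be sketched in a few lines. This is a hedged illustration of the spreadsheet idea only, not VisiCalc's actual design; all names below are invented:

```python
# Minimal spreadsheet sketch: each cell is either a number or a formula
# over other cells, and every cell is re-derived from the inputs on demand.
def evaluate(cells):
    """Resolve every cell to a number, following cell references
    recursively and caching results (no cycle detection, for brevity)."""
    resolved = {}

    def value(name):
        if name not in resolved:
            entry = cells[name]
            # A formula is any callable; it receives `value` so it can
            # pull in the results of other cells.
            resolved[name] = entry if isinstance(entry, (int, float)) else entry(value)
        return resolved[name]

    return {name: value(name) for name in cells}

sheet = {
    "price":   120.0,
    "units":   50,
    "revenue": lambda v: v("price") * v("units"),
    "profit":  lambda v: v("revenue") - 4000,
}
print(evaluate(sheet)["profit"])  # → 2000.0
```

The "what-if" analysis the passage describes is just re-running `evaluate` after altering one input cell, say `sheet["price"] = 130.0`, and letting the grid recompute every ramification.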
Apple II, collective bargaining, computer age, George Gilder, informal economy, laissez-faire capitalism, low skilled workers, means of production, Menlo Park, Murray Gell-Mann, open economy, Richard Feynman, Ronald Reagan, Sand Hill Road, Silicon Valley, Silicon Valley startup, Steve Jobs, Steve Wozniak, union organizing, War on Poverty, women in the workforce, Yom Kippur War
This certainly was true of Apple Computer, which was financed by men associated with Fairchild and Intel and staffed with many people from Hewlett-Packard and Intel.46 Apple had gotten its start in 1976, when 19-year-old Jobs convinced his friend Steve Wozniak, who had developed a personal computer in his garage, to start a business with him. The two showed their computer to venture capitalist Don Valentine (a former Fairchild salesman), who suggested they contact Mike Markkula, recently retired (at age 34) from his job in Intel’s marketing group. Markkula, who had long dreamed of something like a personal computer—as a teenager, he had built a “programmable electronic sliderule”—invested $91,000 in the company. In exchange, he received a one-third ownership stake in Apple.47 One of Markkula’s first calls on behalf of Apple was to Noyce. “I want you to be aware of this,” Markkula said. “I’d like to present to the [Intel] board.” Noyce gave his approval and on the appointed day, Markkula and Steve Wozniak gave a presentation about the personal computer, an Apple II on hand for demonstration purposes.
But he was interesting enough to talk to, and soon Bowers found herself engrossed in what she called “all Steve’s schemes,” only half of which she thought were even remotely feasible. Clearly this was a company that needed her help. She agreed to consult for Apple.49 A few months into her consulting work, Bowers learned that Steve Wozniak wanted to sell some of his founders’ stock for $13 a share. She bought it from him. “Bob thought I was nuts,” she recalls. Noyce did not try to stop her from investing—they had long ago agreed that she could do what she liked with her money, and he could do the same with his—but he could not take Jobs and Wozniak seriously. Even Arthur Rock admits, “Steve Jobs and Steve Wozniak weren’t very appealing people in those days.” Wozniak was the telephone-era’s version of a hacker—he used a small box that emitted electronic tones to call around the world for free—and Steve Jobs’s ungroomed appearance was offputting to Noyce.
Noyce gave his approval and on the appointed day, Markkula and Steve Wozniak gave a presentation about the personal computer, an Apple II on hand for demonstration purposes. “If you want to participate in this in some way, say so,” Markkula told the board. “If you don’t, fine. But this is something you should have in front of your consciousness.” Intel had not given much thought to the personal computer since Moore squelched Noyce and Gelbach’s plans to go head to head with Altair. The board listened politely and asked a few questions, but no one proposed a relationship between Intel and Apple that went beyond Intel’s possibly providing the microprocessor for Apple Computers. “Nothing else was really in Intel’s best interest,” Markkula acknowledges.48 But Arthur Rock had paid careful attention to Markkula and Wozniak’s presentation. A few days later, he called Markkula’s office.
Age of Context: Mobile, Sensors, Data and the Future of Privacy by Robert Scoble, Shel Israel
Albert Einstein, Apple II, augmented reality, call centre, Chelsea Manning, cloud computing, connected car, Edward Snowden, Elon Musk, factory automation, Filter Bubble, Google Earth, Google Glasses, Internet of things, job automation, Kickstarter, Mars Rover, Menlo Park, New Urbanism, PageRank, pattern recognition, RFID, ride hailing / ride sharing, Saturday Night Live, self-driving car, sensor fusion, Silicon Valley, Skype, smart grid, social graph, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Tesla Model S, Tim Cook: Apple, urban planning, Zipcar
Some years will pass before people look back and try to understand how they ever could have lived without such a device. Scoble tells audiences it’s like seeing the first Apple IIs as they rolled off the assembly line in 1977: They were like nothing people had seen before, but you couldn’t do much with them. Decision makers at HP and Atari weren’t interested in cutting a deal with Steve Wozniak and Steve Jobs for rights to market their new computer—the new, highly personalized devices were obviously too radically different to sell in significant quantity. Yet, it turned out a lot of people wanted them and the Apple II kicked off a 20-year explosion of invention and productivity that we now remember as the PC revolution. Google Glass will do the same. How long will it take? We’re not sure.
Overcomplicated: Technology at the Limits of Comprehension by Samuel Arbesman
3D printing, algorithmic trading, Anton Chekhov, Apple II, Benoit Mandelbrot, citation needed, combinatorial explosion, Danny Hillis, David Brooks, discovery of the americas, en.wikipedia.org, Erik Brynjolfsson, Flash crash, friendly AI, game design, Google X / Alphabet X, Googley, HyperCard, Inbox Zero, Isaac Newton, iterative process, Kevin Kelly, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, mandelbrot fractal, Minecraft, Netflix Prize, Nicholas Carr, Parkinson's law, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman: Challenger O-ring, Second Machine Age, self-driving car, software studies, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, superintelligent machines, Therac-25, Tyler Cowen: Great Stagnation, urban planning, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Y2K
A self-taught genius who worked during the early part of the twentieth century, Ramanujan was not your average mathematician who tried to solve problems through trial and error and occasional flashes of brilliance. Instead, equations seemed to leap fully formed from his brain, often mind-bogglingly complex and stunningly correct (though some were also wrong). The Ramanujan of technology might be Steve Wozniak. Wozniak programmed the first Apple computer and was responsible for every aspect of the Apple II. As the programmer and novelist Vikram Chandra notes, “Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found. . . . Woz did both hardware and software. Woz created a programming language in machine code. Woz is hardcore.” Wozniak was on a level of technological understanding that few can reach. We can even see the extremes of our brain’s capacity—as well as how its limits can be stretched—in the way London cabdrivers acquire and use what is known as The Knowledge.
abstraction, 163 biological thinking’s avoidance of, 115–16 in complexity science, 133, 135 in physics thinking, 115–16, 121–22, 128 specialization and, 24, 26–27 technological complexity and, 23–28, 81, 121–22 accretion, 65 in complex systems, 36–43, 51, 62, 65, 191 in genomes, 156 in infrastructure, 42, 100–101 legacy systems and, 39–42 in legal system, 40–41, 46 in software, 37–38, 41–42, 44 in technological complexity, 130–31 unexpected behavior and, 38 aesthetics: biological thinking and, 119 and physics thinking, 113, 114 aggregation, diffusion-limited, 134–35 algorithm aversion, 5 Amazon, 5 American Philosophical Society, 90 Anaximander of Miletus, 139 Apple, 161, 163 Apple II computer, 77 applied mathematics, 143 arche, 140 Ariane 5 rocket, 1996 explosion of, 11–12 Aristotle, 151 Ascher, Kate, 100 Asimov, Isaac, 124 atomic nucleus, discovery of, 124, 141 Audubon, John James, 109 autocorrect, 5, 16 automobiles: self-driving, 91, 231–32 software in, 10–11, 13, 45, 65, 100, 174 see also Toyota automobiles Autonomous Technology (Winner), 22 Average Is Over (Cowen), 84 awe, as response to technological complexity, 6, 7, 154–55, 156, 165, 174 bacteria, 124–25 Balkin, Jack, 60–61 Ball, Philip, 12, 87–88, 136, 140 Barr, Michael, 10 Barrow, Isaac, 89 BASIC, 44–45 Bayonne Bridge, 46 Beacock, Ian, 12–13 Benner, Steven, 119 “Big Ball of Mud” (Foote and Yoder), 201 binary searches, 104–5 biological systems, 7 accretion in, 130–31 complexity of, 116–20, 122 digital technology and, 49 kluges in, 119 legacy code in, 118, 119–20 modules in, 63 tinkering in, 118 unexpected behavior in, 109–10, 123–24 biological thinking, 222 abstraction avoided in, 115–16 aesthetics and, 119 as comfortable with diversity and complexity, 113–14, 115 concept of miscellaneous in, 108–9, 140–41, 143 as detail oriented, 121, 122, 128 generalization in, 131–32 humility and, 155 physics thinking vs., 114–16, 137–38, 142–43, 222 technological complexity and, 116–49, 158, 174 Blum, Andrew, 101–2 
Boeing 777, 99 Bogost, Ian, 154 Bookout, Jean, 10 Boorstin, Daniel, 89 Borges, Jorge Luis, 76–77, 131 Boston, Mass., 101, 102 branch points, 80–81 Brand, Stewart, 39–40, 126, 198–99 Brookline, Mass., 101 Brooks, David, 155 Brooks, Frederick P., Jr., 38, 59, 93 bugs, in software, see software bugs bureaucracies, growth of, 41 cabinets of curiosities (wunderkammers), 87–88, 140 calendar application, programming of, 51–53 Cambridge, Mass., 101 cancer, 126 Carew, Diana, 46 catastrophes, interactions in, 126 Challenger disaster, 9, 11, 12, 192 Chandra, Vikram, 77 Chaos Monkey, 107, 126 Chekhov, Anton, 129 Chekhov’s Gun, 129 chess, 84 Chiang, Ted, 230 clickstream, 141–42 Clock of the Long Now, The (Brand), 39–40 clouds, 147 Code of Federal Regulations, 41 cognitive processing: of language, 73–74 limitations on, 75–76, 210 nonlinear systems and, 78–79 outliers in, 76–77 working memory and, 74 see also comprehension, human collaboration, specialization and, 91–92 Commodore VIC-20 computer, 160–61 complexity, complex systems: acceptance of, see biological thinking accretion in, 36–43, 51, 62, 65, 191 aesthetics of, 148–49, 156–57 biological systems and, 116–17, 122 buoys as examples of, 14–15, 17 complication vs., 13–15 connectivity in, 14–15 debugging of, 103–4 edge cases in, 53–62, 65, 201, 205 feedback and, 79, 141–45 Gall on, 157–58, 227 hierarchies in, 27, 50–51 human interaction with, 163 infrastructure and, 100–101 inherent vs. 
accidental, 189 interaction in, 36, 43–51, 62, 65, 146 interconnectivity of, see interconnectivity interpreters of, 166–67, 229 kluges as inevitable in, 34–36, 62–66, 127 in legal systems, 85 and limits of human comprehension, 1–7, 13, 16–17, 66, 92–93 “losing the bubble” and, 70–71, 85 meaning of terms, 13–20 in natural world, 107–10 scientific models as means of understanding, 165–67 specialization and, 85–93 unexpected behavior in, 27, 93, 96–97, 98–99, 192 see also diversity; technological complexity complexity science, 132–38, 160 complication, complexity vs., 13–15 comprehension, human: educability of, 17–18 mystery and, 173–74 overoptimistic view of, 12–13, 152–53, 156 wonder and, 172 see also cognitive processing comprehension, human, limits of, 67, 212 complex systems and, 1–7, 13, 16–17, 66, 92–93 humility as response to, 155–56 interconnectivity and, 78–79 kluges and, 42 legal system and, 22 limitative theorems and, 175 “losing the bubble” in, 70–71, 85 Maimonides on, 152 stock market systems and, 26–27 technological complexity and, 18–29, 69–70, 80–81, 153–54, 169–70, 175–76 unexpected behavior and, 18–22, 96–97, 98 “Computational Biology” (Doyle), 222 computational linguistics, 54–57 computers, computing: complexity of, 3 evolutionary, 82–84, 213 impact on technology of, 3 see also programmers, programming; software concealed electronic complexity, 164 Congress, U.S., 34 Constitution, U.S., 33–34 construction, cost of, 48–50 Cope, David, 168–69, 229–30 corpus, in linguistics, 55–56 counting: cognitive limits on, 75 human vs. 
computer, 69–70, 97, 209 Cowen, Tyler, 84 Cryptonomicon (Stephenson), 128–29 “Crystalline Structure of Legal Thought, The” (Balkin), 60–61 Curiosity (Ball), 87–88 Dabbler badge, 144–45 dark code, 21–22 Darwin, Charles, 115, 221, 227 Daston, Lorraine, 140–41 data scientists, 143 datasets, massive, 81–82, 104–5, 143 debugging, 103–4 Deep Blue, 84 diffusion-limited aggregation (DLA), 134–35 digital mapping systems, 5, 49, 51 Dijkstra, Edsger, 3, 50–51, 155 “Divers Instances of Peculiarities of Nature, Both in Men and Brutes” (Fairfax), 111–12 diversity, 113–14, 115 see also complexity, complex systems DNA, see genomes Doyle, John, 222 Dreyfus, Hubert, 173 dwarfism, 120 Dyson, Freeman, on unity vs. diversity, 114 Dyson, George, 110 Economist, 41 edge cases, 53–62, 65, 116, 128, 141, 201, 205, 207 unexpected behavior and, 99–100 see also outliers Einstein, Albert, 114 Eisen, Michael, 61 email, evolution of, 32–33 emergence, in complex systems, 27 encryption software, bugs in, 97–98 Enlightenment, 23 Entanglement, Age of, 23–29, 71, 92, 96, 97, 165, 173, 175, 176 symptoms of, 100–102 Environmental Protection Agency, 41 evolution: aesthetics and, 119 of biological systems, 117–20, 122 of genomes, 118, 156 of technological complexity, 127, 137–38 evolutionary computation, 82–84, 213 exceptions, see edge cases; outliers Facebook, 98, 189 failure, cost of, 48–50 Fairfax, Nathanael, 111–12, 113, 140 fear, as response to technological complexity, 5, 7, 154–55, 156, 165 Federal Aviation Administration (FAA), Y2K bug and, 37 feedback, 14–15, 79, 135 Felsenstein, Lee, 21 Fermi, Enrico, 109 Feynman, Richard, 9, 11 field biologists, 122 for complex technologies, 123, 126, 127, 132 financial sector: interaction in, 126 interconnectivity of, 62, 64 see also stock market systems Firthian linguistics, 206 Flash Crash (2010), 25 Fleming, Alexander, 124 Flood, Mark, 61, 85 Foote, Brian, 201 Fortran, 39 fractals, 60, 61, 136 Frederick the Great, king of Prussia, 89 fruit flies, 109–10 
“Funes the Memorious” (Borges), 76–77, 131 Galaga, bug in, 95–96, 97, 216–17 Gall, John, 157–58, 167, 227 game theory, 210 garden path sentences, 74–75 generalists, 93 combination of physics and biological thinking in, 142–43, 146 education of, 144, 145 explosion of knowledge and, 142–49 specialists and, 146 as T-shaped individuals, 143–44, 146 see also Renaissance man generalization, in biological thinking, 131–32 genomes, 109, 128 accretion in, 156 evolution of, 118, 156 legacy code (junk) in, 118, 119–20, 222 mutations in, 120 RNAi and, 123–24 Gibson, William, 176 Gingold, Chaim, 162–63 Girl Scouts, 144–45 glitches, see unexpected behavior Gmail, crash of, 103 Gödel, Kurt, 175 “good enough,” 27, 42, 118, 119 Goodenough, Oliver, 61, 85 Google, 32, 59, 98, 104–5 data centers of, 81–82, 103, 189 Google Docs, 32 Google Maps, 205 Google Translate, 57 GOTO command, 44–45, 81 grammar, 54, 57–58 gravitation, Newton’s law of, 113 greeblies, 130–31 Greek philosophy, 138–40, 151 Gresham College, 89 Guide of the Perplexed, The (Maimonides), 151 Haldane, J.
The Rise of the Network Society by Manuel Castells
Apple II, Asian financial crisis, barriers to entry, Big bang: deregulation of the City of London, borderless world, British Empire, capital controls, complexity theory, computer age, Credit Default Swap, declining real wages, deindustrialization, delayed gratification, dematerialisation, deskilling, disintermediation, double helix, Douglas Engelbart, edge city, experimental subject, financial deregulation, financial independence, floating exchange rates, future of work, global village, Hacker Ethic, hiring and firing, Howard Rheingold, illegal immigration, income inequality, industrial robot, informal economy, information retrieval, intermodal, invention of the steam engine, invention of the telephone, inventory management, James Watt: steam engine, job automation, job-hopping, knowledge economy, knowledge worker, labor-force participation, labour market flexibility, labour mobility, laissez-faire capitalism, low skilled workers, manufacturing employment, Marshall McLuhan, means of production, megacity, Menlo Park, new economy, New Urbanism, offshore financial centre, oil shock, open economy, packet switching, planetary scale, popular electronics, post-industrial society, postindustrial economy, prediction markets, Productivity paradox, profit maximization, purchasing power parity, RAND corporation, Robert Gordon, Silicon Valley, Silicon Valley startup, social software, South China Sea, South of Market, San Francisco, special economic zone, spinning jenny, statistical model, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, the medium is the message, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, total factor productivity, trade liberalization, transaction costs, urban renewal, urban sprawl
In 1975, Ed Roberts, an engineer who had created a small calculator company, MITS, in Albuquerque, New Mexico, built a computing box with the improbable name of Altair, taken from the Star Trek TV series, a favorite of the inventor’s young daughter. The machine was a primitive object, but it was built as a small-scale computer around a microprocessor. It was the basis for the design of the Apple I, then of the Apple II, the first commercially successful microcomputer, realized in the garage of Jobs’s parents’ home in Los Altos, in Silicon Valley, by two young school drop-outs, Steve Wozniak and Steve Jobs, in a truly extraordinary saga that has by now become the founding legend of the Information Age. Launched in 1976, with three partners and $91,000 in capital, Apple Computer had by 1982 reached $583 million in sales, ushering in the age of diffusion of computer power. IBM reacted quickly: in 1981 it introduced its own version of the microcomputer, with a brilliant name: the Personal Computer (PC), which in fact became the generic name for microcomputers.
But a few indications seem to point to the fact that they were intentionally trying to undo the centralizing technologies of the corporate world, both out of conviction and as their market niche. As evidence, I recall the famous Apple Computer 1984 advertising spot to launch Macintosh, in explicit opposition to Big Brother IBM of Orwellian mythology. As for the countercultural character of many of these innovators, I shall also refer to the life story of the genius developer of the personal computer, Steve Wozniak: after quitting Apple, bored by its transformation into another multinational corporation, he spent a fortune for a few years subsidizing rock groups that he liked, before creating another company to develop technologies of his taste. At one point, after having created the personal computer, Wozniak realized that he had no formal education in computer sciences, so he enrolled at UC Berkeley.
In 1988, it could be estimated that “venture capital accounted for about one-half of the new product and service investment associated with the information and communication industry.”68 A similar process took place in the development of the microcomputer, which introduced an historical divide in the uses of information technology.69 By the mid-1970s Silicon Valley had attracted tens of thousands of bright young minds from around the world, coming to the excitement of the new technological Mecca in a quest for the talisman of invention and money. They gathered in loose groups to exchange ideas and information on the latest developments. One such gathering was the Homebrew Computer Club, whose young visionaries (including Bill Gates, Steve Jobs, and Steve Wozniak) would go on to create in the following years up to 22 companies, including Microsoft, Apple, Cromemco, and North Star. It was the club’s reading, in Popular Electronics, of an article reporting Ed Roberts’s Altair machine that inspired Wozniak to design a microcomputer, the Apple I, in the garage of the Jobs family home in Los Altos in the summer of 1976. Steve Jobs saw the potential, and together they founded Apple with a $91,000 loan from a former Intel executive, Mike Markkula, who came in as a partner.
Albert Einstein, Andrew Keen, Apple II, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, Burning Man, butterfly effect, computer age, crowdsourcing, cuban missile crisis, Dissolution of the Soviet Union, don't be evil, Douglas Engelbart, Dynabook, East Village, Edward Lorenz: Chaos theory, Fall of the Berlin Wall, Francis Fukuyama: the end of history, Frank Gehry, Grace Hopper, gravity well, Guggenheim Bilbao, Honoré de Balzac, Howard Rheingold, invention of movable type, Isaac Newton, Jacquard loom, Jane Jacobs, Jeff Bezos, John von Neumann, Mark Zuckerberg, Marshall McLuhan, Mercator projection, Mother of all demos, mutually assured destruction, Network effects, new economy, Norbert Wiener, PageRank, pattern recognition, planetary scale, Plutocrats, Post-materialism, Potemkin village, RFID, Richard Feynman, Richard Stallman, Robert X Cringely, Schrödinger's Cat, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, Skype, social software, spaced repetition, Steve Ballmer, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, The Death and Life of Great American Cities, the medium is the message, Thomas L Friedman, Turing machine, Turing test, urban planning, urban renewal, Vannevar Bush, walkable city, Watson beat the top human players on Jeopardy!, William Shockley: the traitorous eight
Jobs and Gates started out when personal computing, that idea advanced by Licklider the Patriarch and Kay the Aquarian, was the province of a tiny group of obsessed hobbyists. It was a business, but one with a smaller market than fly-fishing. As teenagers in the 1970s, Jobs and Gates were part of this small group of hobbyists who purchased kits to make simple, programmable computers to use (and play with) at home. Jobs, along with Steve Wozniak, was a member of the best-known group of these enthusiasts, the Homebrew Computer Club of Cupertino, California. Gates, who had been programming since he found himself able to get access to a DEC mainframe in high school, was already writing software professionally while he was a student at Harvard. Jobs and Gates, along with their collaborators and competitors in the mid-1970s, were positioned at a fulcrum point, when a diversion turned into a business.
What made them both rich and powerful was their ability to meld the attributes of the two generations that preceded them—fusing the hardheaded business logic of the Plutocrats with the visionary futurity of the Aquarians. Jobs and Gates have an interesting competitive history, leapfrogging each other in the quest to achieve “insane greatness,” in Jobs’s words, and global market preeminence, for Gates.21 Jobs and his partner, Wozniak, were the first to make the leap from hobbyists to industrialists with their Apple computers, launched in 1976. It was the Apple II that really broke loose, in 1977, attracting a huge user base, and establishing Jobs and Wozniak as the first publicly lauded millionaire whiz kids of Silicon Valley. As important as their early success with the Apple II was, however, their greatest impact came seven years later, when they took the inspiration of people like Engelbart and Kay, and created a mass-market personal computer that set a new standard for participation. Before we get to that, we need to return to 1976, and move from Silicon Valley to New Mexico, where Gates and his partners, including former Harvard friends Paul Allen and Steve Ballmer, were writing programs for the Altair computer.
He worked for early electronic games pioneer Atari in the late 1970s and visited Xerox PARC, where he saw the work infused with Engelbart and Kay’s Aquarian vision. This spirit resonated with Jobs, who at one point had taken a personal pilgrimage to India and lived in an ashram. But even more so, the meme of participation entered his head on those visits to PARC. The Apple II, released in 1977, was unique in having a graphics capability and a soundboard built in. Here was the first major computer for the masses, designed from the start as a multimedia machine. These Apple IIs became the de facto machines in classrooms around the country, and without a doubt prepared a generation of computer users for what was to come. Jobs understood that the graphical user interface would open up a whole new range of applications to nonexpert users, but even more would expand that user community exponentially.
Geek Sublime: The Beauty of Code, the Code of Beauty by Vikram Chandra
Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, British Empire, business process, conceptual framework, create, read, update, delete, crowdsourcing, East Village, European colonialism, finite state, Firefox, Flash crash, glass ceiling, Grace Hopper, haute couture, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce
After such knowledge, reverence is the only proper emotion; the narrator tells his Big Boss that he can’t fix the error because he can’t find it. I didn’t feel comfortable hacking up the code of a Real Programmer.12 Despite the allusion above to “the *macho* side of programming,” the non-geek may not fully grasp that within the culture of programmers, Mel es muy macho. The Real Programmer squints his eyes, does his work, and rides into the horizon to the whistling notes of Ennio Morricone. To you, Steve Wozniak may be that cuddly penguin who was on a few episodes of Dancing with the Stars, and by all accounts, he really is the good, generous man one sees in interviews. But within the imaginations of programmers, Woz is also a hard man, an Original Gangsta: he wired together his television set and a keyboard and a bunch of chips on a circuit board and so created the Apple I computer. Then he realized he needed a programming language for the microprocessor he’d used, and none existed, so Woz—who had never taken a language-design class—read a couple of books, wrote a compiler, and then wrote a programming language called Integer BASIC in machine code.
And when we say “wrote” this programming language we mean that he wrote the assembly code in a paper notebook on the right side of the pages, and then transcribed it into machine code on the left.13 And he did all this while holding down a full-time job at Hewlett-Packard: “I designed two computers and cassette tape interfaces and printer interfaces and serial ports and I wrote a Basic and all this application software, I wrote demos, and I did all this moonlighting, all in a year.”14 That second computer was the Apple II, the machine that defined personal computing, that is on every list of the greatest computers ever made. Woz designed all the hardware and all the circuit boards and all the software that went into the Apple II, while the other Steve spewed marketing talk at potential investors and customers on the phone. Every piece and bit and byte of that computer was done by Woz, and not one bug has ever been found, “not one bug in the hardware, not one bug in the software.”15 The circuit design of the Apple II is widely considered to be astonishingly beautiful, as close to perfection as one can get in engineering. Woz did both hardware and software. Woz created a programming language in machine code.
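What "hand-assembly" means in practice can be sketched in a few lines of code. This is a toy illustration under stated assumptions, not Woz's actual method or any real Integer BASIC source: each 6502 mnemonic/addressing-mode pair corresponds to a fixed opcode byte, and the programmer working on paper translates each assembly line into those bytes by hand (the names `OPCODES` and `assemble` are hypothetical, and the target address is arbitrary):

```python
# Toy sketch of hand-assembling 6502 code the way a programmer working
# on paper would: look up the opcode byte for each mnemonic/mode pair,
# then write the operand bytes after it, low byte first.
OPCODES = {
    ("LDA", "imm"): 0xA9,  # load accumulator, immediate operand
    ("STA", "abs"): 0x8D,  # store accumulator, absolute address
    ("RTS", None):  0x60,  # return from subroutine
}

def assemble(lines):
    """Translate (mnemonic, mode, operand_bytes) tuples into machine code."""
    out = []
    for mnemonic, mode, operands in lines:
        out.append(OPCODES[(mnemonic, mode)])
        out.extend(operands)  # operand bytes, little-endian
    return bytes(out)

# LDA #$41 / STA $1234 / RTS  ->  A9 41 8D 34 12 60
code = assemble([
    ("LDA", "imm", [0x41]),
    ("STA", "abs", [0x34, 0x12]),  # $1234, low byte first
    ("RTS", None, []),
])
print(code.hex())  # a9418d341260
```

Doing this by hand for an entire BASIC interpreter, with no assembler to catch mistakes, is what makes the feat so remarkable.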
University of Pennsylvania Almanac 42, no. 18 (1996): 4–7. Witzel, Michael. “On the Origin of the Literary Device of the Frame Story in Old Indian Literature.” In Hinduismus Und Buddhismus: Festschrift Für Ulrich Schneider, edited by Harry Falk, 380–414. Freiburg: Hedwig Falk, 1987. World Economic Forum. Global Gender Gap Report. October 23, 2012. http://www.weforum.org/issues/global-gender-gap. Wozniak, Steve. “And Then There Was Apple.” Apple II History. Accessed August 10, 2013. http://apple2history.org/museum/articles/ca8610/. Wright, Edmund, and John Daintith. A Dictionary of Computing. Online. Oxford University Press, 2008. http://www.oxfordreference.com/10.1093/acref/9780199234004.001.0001/acref-9780199234004-e-2050. Wujastyk, Dominik. “Indian Manuscripts.” In Manuscript Cultures: Mapping the Field, edited by Jörg Quenzer and Jan-Ulrich Sobisch.
Makers by Chris Anderson
3D printing, Airbnb, Any sufficiently advanced technology is indistinguishable from magic, Apple II, autonomous vehicles, barriers to entry, Buckminster Fuller, Build a better mousetrap, business process, crowdsourcing, dark matter, David Ricardo: comparative advantage, death of newspapers, dematerialisation, Elon Musk, factory automation, Firefox, future of work, global supply chain, global village, industrial robot, interchangeable parts, Internet of things, inventory management, James Hargreaves, James Watt: steam engine, Jeff Bezos, job automation, Joseph Schumpeter, Kickstarter, Lean Startup, manufacturing employment, Mark Zuckerberg, means of production, Menlo Park, Network effects, profit maximization, race to the bottom, Richard Feynman, Richard Feynman, Ronald Coase, self-driving car, side project, Silicon Valley, Silicon Valley startup, Skype, slashdot, South of Market, San Francisco, spinning jenny, Startup school, stem cell, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, supply-chain management, The Nature of the Firm, The Wealth of Nations by Adam Smith, transaction costs, trickle-down economics, Whole Earth Catalog, X Prize, Y Combinator
Writing in Wired,12 Steven Levy explained the connection, which led to the original Apple II in 1977: His dad, Paul—a machinist who had never completed high school—had set aside a section of his workbench for Steve, and taught him how to build things, disassemble them, and put them back together. From neighbors who worked in the electronics firms in the Valley, he learned about that field—and also understood that things like television sets were not magical things that just showed up in one’s house, but designed objects that human beings had painstakingly created. “It gave a tremendous sense of self-confidence, that through exploration and learning one could understand seemingly very complex things in one’s environment,” he told [an] interviewer. Later, when Jobs and his Apple cofounder, Steve Wozniak, were members of the Homebrew Computer Club, they saw the potential of desktop tools—in this case the personal computer—to change not just people’s lives, but also the world.
Like IBM a generation ago, which went from corporate mainframes to personal computers, they are recognizing that their futures lie with regular folks. They are pivoting from professionals to everyone. In short, the Maker Movement has arrived. This nascent movement is less than seven years old, but it’s already accelerating as fast as the early days of the PC, when the garage tinkerers who were part of the Homebrew Computer Club in 1975 created the Apple II, the first consumer desktop computer, which led to desktop computing and the explosion of a new industry. Similarly, you can mark the beginnings of the Maker Movement with such signs as the 2005 launch of Make magazine, from O’Reilly, a legendary publisher of geek bibles, and the first Maker Faire gatherings in Silicon Valley. Another key milestone arrived with RepRap, the first open-source desktop 3-D printer, which was launched in 2007.
Indeed, in 1969 Honeywell even offered a $10,000 “kitchen computer” (official name: the “H316 Pedestal Model”), which was promoted on the cover of the Neiman-Marcus catalog to do just that—it was stylishly designed, with a built-in cutting board. (There is no evidence that any actually sold, not least because the very modern cook would have to enter data with toggle switches and read the recipes displayed in binary blinking lights.) Yet when the truly personal—“desktop”—computer did eventually arrive with the Apple II and then the IBM PC, countless uses quickly emerged, starting with the spreadsheet and word processor for business and quickly moving to entertainment with video games and communications. This was not because the wise minds of the big computer companies had finally figured out why people would want one, but because people found new uses all by themselves. Then, in 1985, Apple released the LaserWriter, the first real desktop laser printer, which, along with the Mac, started the desktop publishing phenomenon.
A Curious Mind: The Secret to a Bigger Life by Brian Grazer, Charles Fishman
4chan, Airbnb, Albert Einstein, Apple II, Asperger Syndrome, Bonfire of the Vanities, en.wikipedia.org, game design, Google Chrome, Howard Zinn, Isaac Newton, Jeff Bezos, Kickstarter, out of africa, RAND corporation, Ronald Reagan, Silicon Valley, stem cell, Steve Jobs, Steve Wozniak, the scientific method, Tim Cook: Apple
Williams: former police chief of Los Angeles Marianne Williamson: spiritual teacher, New Age guru Ian Wilmut: embryologist, led the team of researchers who first successfully cloned a mammal (a sheep named Dolly) E. O. Wilson: biologist, author, professor emeritus at Harvard University, two-time winner of the Pulitzer Prize Oprah Winfrey: founder and chairwoman of the Oprah Winfrey Network, actress, author George C. Wolfe: playwright, theater director, two-time winner of the Tony Award Steve Wozniak: cofounder of Apple Inc., designer of Apple I and Apple II computers, inventor John D. Wren: president and CEO of marketing and communications company Omnicom Will Wright: game designer, creator of Sim City and The Sims Steve Wynn: businessman, Las Vegas casino magnate Gideon Yago: writer, former correspondent for MTV News Eitan Yardeni: teacher and spiritual counselor at the Kabbalah Centre Daniel Yergin: economist, author of The Prize: The Epic Quest for Oil, Money and Power, winner of the Pulitzer Prize Dan York: chief content officer at DirecTV, former president of content and advertising sales, AT&T Michael W.
Apple II, British Empire, Claude Shannon: information theory, en.wikipedia.org, indoor plumbing, Internet Archive, Jeff Bezos, Jony Ive, Kevin Kelly, Sand Hill Road, Saturday Night Live, Silicon Valley, social web, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, technology bubble, Thomas L Friedman
Two things always seem to evoke an indignant outburst of "It ain't natural!" One is drugs; the other is technology, applied so as to please ourselves. When the latter is used to get effects as mind-blowing as the former, things become really interesting. (One of the most memorable quotes I've ever gathered in my reporting career came in 1982, covering the US Festival, a huge rock concert sponsored by Apple cofounder Steve Wozniak. At a motel nearby, Jerry Garcia, who was prepping to play a "Breakfast with the Grateful Dead" set, proclaimed, "Technology is the new drugs." Okay, not an original concept, but consider the source.) Without altering one's chemical composition, the iPod does change your head. Plugging directly into your ears, dominating the brain matter in between, and shuffling your music collection to extract constant delight, it generates a portable alternative reality, almost always more pleasant than the real one.
The Sony-ites in attendance, mostly younger engineers, technical designers, and marketers, responded with panic but executed with fervor, in part because the Walkman was a product they wanted to own themselves. They made the deadline. On June 22, 1979, Sony unveiled its baby with a degree of showmanship and suspense that Steve Jobs—then a twenty-four-year-old mogul-in-training, beginning to contemplate a successor to the Apple II—might have appreciated. Journalists arriving at Sony's headquarters in the Ginza were directed to buses and handed a Walkman. Not until they arrived at Yoyogi Park—Tokyo's version of Central Park, which was filled every Sunday with crazed Elvis impersonators—were they directed to turn the devices on, to hear a sonically thrilling stereo introduction to the capabilities and virtues of the Walkman.
It's not as though Jobs doesn't get respect; in the business mags he is the equivalent of Princess Diana as a cover subject. (Jobs considers cover stories his birthright and often grants exclusive access in exchange for getting Apple's new products on the cover.) But only recently, with the dual success of the iPod and Pixar, have people come to realize that Jobs is building a historical legacy. This is a guy who has pulled off four accomplishments that rocked the world. With the Apple II, he was instrumental in introducing the concept of a personal computer to the world. With the Macintosh, he popularized what was to become the dominant—and friendliest—means of using a computer. As the CEO of Pixar, he helped usher in the era of computer-animated feature films. And now there is the iPod. Jobs himself looks back to the Macintosh effort as a peak. Other people involved in the effort look back to that period with a Camelot-type nostalgia.
Running Money by Andy Kessler
Andy Kessler, Apple II, bioinformatics, British Empire, business intelligence, buy low sell high, call centre, Corn Laws, family office, full employment, George Gilder, happiness index / gross national happiness, interest rate swap, invisible hand, James Hargreaves, James Watt: steam engine, joint-stock company, joint-stock limited liability company, knowledge worker, Long Term Capital Management, mail merge, margin call, market bubble, Maui Hawaii, Menlo Park, Network effects, packet switching, pattern recognition, pets.com, railway mania, risk tolerance, Sand Hill Road, Silicon Valley, South China Sea, spinning jenny, Steve Jobs, Steve Wozniak, Toyota Production System
I once asked Gordon Moore about the whole Microma experience. He quickly pulled up his sleeve and pointed to a Microma watch on his wrist and told me he wore it often to remind himself to never be that stupid again. Intel’s lesson: make the intellectual property, not the end product. The cool thing about a computer on a chip is you can start a computer company without knowing much about computers. Steve Jobs and Steve Wozniak created Apple Computer without knowing that much. Wozniak had to write some software to get data on and off a floppy disk drive, which no one else had, and their Apple II became a hit. IBM knew lots about how to milk big bucks out of big computers, but nothing about microprocessors. So a stealth group in Florida contracted out the work, creating a Frankenstein-like IBM PC in 1981, using an Intel microprocessor, Microsoft software and a Western Digital disk controller.
It was lunchtime at George Gilder’s Telecosm conference, and we were waiting for the featured speaker, Gary Winnick of Global Crossing, to explain how he sends billions of packets per second under the Atlantic Ocean. George Gilder has hosted his Telecosm conference for years. Tech luminaries like Carver Mead, Bob Metcalfe and Paul Allen were regulars. “I don’t know what the first packet was,” I confessed. My tablemate turned out to be Leonard Kleinrock, a UCLA professor, according to his name tag. It turned out that he had been at the creation. From the 1978 introduction of the Apple II computer to the 1981 announcement of the IBM PC, the world has been flooded with smaller, cheaper and faster computers. More than 100 million new ones get sold every year. But today, these are no islands—the power of these computers is in their ability to communicate. The telephone network, which is optimized for your talks with Mom, was the medium for computer communications.
See Advanced Micro Devices American Federation of Information Processing Societies, Fall Joint Computer Conference (1968), 119–20, 123 America Online. See AOL Andreessen, Marc, 197, 199 animation, 134–35 AOL (America Online), 69–73, 207, 208, 223, 290 Cisco routers and, 199 Inktomi cache software and, 143 Netscape Navigator purchase, 201, 225 Telesave deal, 72–73 TimeWarner deal, 223, 229 as top market cap company, 111 Apache Web server, 247 Apple Computer, 45, 127, 128 Apple II, 183 Applied Materials, 245 Archimedes (propeller ship), 94 Arkwright, Richard, 65 ARPANET, 186, 187, 189, 191 Arthur Andersen, 290 Artists and Repertoire (A&R), 212, 216 Asian debt crisis, 3, 150, 151, 229, 260 yen and, 162–65, 168, 292 @ (at sign), 187 AT&T, 61, 185–86, 189 August Capital, 2, 4 auto industry, 267–68 Aziz, Tariq, 26 Babbage, Charles, 93 Baker, James, 26 Balkanski, Alex, 44, 249 bandwidth, 60, 111, 121, 140, 180, 188–89 Baran, Paul, 184, 185 Barbados, 251, 254 Barksdale, Jim, 198, 199–201 Barksdale Group, 201 BASE, 249 BASIC computer language, 126, 127 BBN.
The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop
Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, double helix, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process
On the hardware side, this challenge was taken up most famously by the Apple Computer Company, founded in 1976 by Homebrew Computer Club members Steve Wozniak and Steve Jobs, longtime buddies from the Silicon Valley town of Cupertino. After some encouraging success with their first computer, which they marketed through local hobby shops—it was actually just a single circuit board using the new, 8-bit 6502 microprocessor from MOS Technology, plus 4 kilobytes of RAM—Jobs and Wozniak were joined by the thirty-four-year-old A. C. Markkula, formerly the marketing manager for Intel. Markkula, who had retired from that company two years earlier after earning more than a million dollars in stock options, bought a one-third partnership in Apple for $91,000 and began working his contacts to bring in venture capital and management expertise. The result was the Apple II, a much-upgraded, 6502-based micro that was introduced in April 1977 at the first West Coast Computer Faire in San Francisco.
What could a sclerotic bureaucracy like Xerox do that his own team couldn't do better and faster? Partly, this reaction was a function of his countercultural disdain for large corporations in general. (The scruffily bearded Jobs wore his T-shirts, jeans, and sandals like a badge of honor.) And partly it was attributable to his memory of Steve Wozniak's former employer, Hewlett-Packard, which had once rebuffed Woz's proposal for a microcomputer. But in any case, Jobs changed his mind after repeated urging by Apple engineer Jef Raskin, who had joined the company to help design the Apple II. Raskin had visited PARC, as it happened, and his friends there had shown him its wonders. So on April 2, 1979, Jobs and his team met with the XDC people and struck a deal that could make sense only in the go-go world of Silicon Valley: Xerox would be allowed to invest $1.05 million in Apple's private stock sale, and in return it would allow Apple full access to PARC's technology.
And perhaps most important of all, it was great for playing video games; Wozniak, the technical wizard of the team and a video-game addict himself, had designed it with precisely that use in mind. Of course, the Apple II had plenty of competition in the consumer market, notably from the Commodore PET, which debuted at the same West Coast Computer Faire, and from the Tandy-Radio Shack TRS-80, which was introduced the following August. In the beginning, moreover, the promises made in the Regis McKenna ad copy—"You'll be able to organize, index and store data on household finances, income taxes, recipes, your biorhythms, balance your checking account, even control your home environment"—were little more than fantasy; the applications software that would work such magic didn't exist yet. Nonetheless, the Apple II was an instant hit. By decade's end Apple itself had become one of the fastest-growing companies in American history.
The Entrepreneurial State: Debunking Public vs. Private Sector Myths by Mariana Mazzucato
Apple II, banking crisis, barriers to entry, Bretton Woods, California gold rush, call centre, carbon footprint, Carmen Reinhart, cleantech, computer age, credit crunch, David Ricardo: comparative advantage, demand response, deskilling, energy security, energy transition, eurozone crisis, everywhere but in the productivity statistics, Financial Instability Hypothesis, full employment, Growth in a Time of Debt, Hyman Minsky, incomplete markets, information retrieval, invisible hand, Joseph Schumpeter, Kenneth Rogoff, knowledge economy, knowledge worker, natural language processing, new economy, offshore financial centre, popular electronics, profit maximization, Ralph Nader, renewable energy credits, rent-seeking, ride hailing / ride sharing, risk tolerance, shareholder value, Silicon Valley, Silicon Valley ideology, smart grid, Steve Jobs, Steve Wozniak, The Wealth of Nations by Adam Smith, Tim Cook: Apple, too big to fail, total factor productivity, trickle-down economics, Washington Consensus, William Shockley: the traitorous eight
While the products owe their beautiful design and slick integration to the genius of Jobs and his large team, nearly every state-of-the-art technology found in the iPod, iPhone and iPad is an often overlooked and ignored achievement of the research efforts and funding support of the government and military. Only about a decade ago Apple was best known for its innovative personal computer design and production. Established on 1 April 1976 in Cupertino, California by Steve Jobs, Steve Wozniak and Ronald Wayne, Apple was incorporated in 1977 by Jobs and Wozniak to sell the Apple I personal computer.1 The company was originally named Apple Computer, Inc. and for 30 years focused on the production of personal computers. On 9 January 2007, the company announced it was removing the ‘Computer’ from its name, reflecting its shift in focus from personal computers to consumer electronics.
In effect, the government and business community underestimated the challenge at hand, though critics tend to focus on the failure of government and not of finance; and that (3) failure is hard to judge unless we have proper metrics to be able to understand the spillover effects that investments have, even when there is no final product. These international projects did establish networks of learning between utilities, government R&D, the business community and universities.
3. Moreover, as discussed in Chapter 5, the Apple II, which ran Kenetech’s first projects, would also not have been possible without government investments.
4. Hoffman had acquired the original Bell Labs patent through acquisition of National Fabricated Products in 1956.
5. Details on Suntech are based on a forthcoming piece of work by Matt Hopkins and Yin Li, ‘The Rise of the Chinese Solar Photovoltaic Industry and its Impact on Competition and Innovation’.
From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry by Martin Campbell-Kelly
Apple II, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, business process, card file, computer age, computer vision, continuous integration, deskilling, Grace Hopper, inventory management, John von Neumann, linear programming, Menlo Park, Network effects, popular electronics, RAND corporation, Robert X Cringely, Ronald Reagan, Silicon Valley, software patent, Steve Jobs, Steve Wozniak, Steven Levy, Thomas Kuhn: the structure of scientific revolutions
The transforming event for the personal computer was the launch of the Apple II in April 1977. The tiny firm of Apple Computer had been formed by the computer hobbyists Steve Jobs and Steve Wozniak in 1976. Their first machine, the Apple, was a raw computer board designed for kit-building hobbyists. The Apple II, however, was an unprecedented leap of imagination and packaging. Looking much like the computer terminals seen on airport reservation desks, it consisted of a keyboard, a CRT display screen, and a central processing unit, all in one package. Though Jobs was not alone in having such a vision of the personal computer, he was by far the most successful at orchestrating the technological and manufacturing resources to bring it to fruition. During 1977, the Apple II was joined by many imitators from entrepreneurial startups, and by machines from two major electronics manufacturers: the Commodore PET and the TRS-80.
The PFS:File database system was intended to occupy a perceived gap in the market for a mid-price database that would also exploit John Page’s background as the designer of the database software for the HP 3000 minicomputer. The package was developed for the Apple II, with ease of use rather than technical sophistication as its prime selling point. It was priced at $140, barely one-fifth the price of dBase II. Produced and published in house, the package was distributed directly to computer stores. All this was achieved in 1980, when the founders were still full-time employees of Hewlett-Packard. In early 1981, with the success of PFS:File, they secured venture funding and incorporated as Software Publishing. By the fall of 1983, they had sold a quarter-million copies of PFS:File for Apple II, TRS-80, and IBM-compatible machines. Software Publishing continued to exploit the market niche for mid-price software with follow-on products such as PFS:Write and PFS:Graph.
In the boulders-pebbles-sand model of the software industry, the microcomputer-industry-specific application vendors were some of the finer grains of sand. Consumer Software When the Apple II was launched in 1977, it was positioned as a “home/personal” computer. The advertising copy reflected that: The home computer that’s ready to work, play and grow with you. . . . You’ll be able to organize, index and store data on household finances, income taxes, recipes, your biorhythms, balance your checking account, even control your home environment.41 In fact, hardly any of those applications were achievable; the software did not exist. In 1977, the Apple II was too limited and too expensive for home use other than by the most dedicated enthusiast, so it was sold primarily to schools and businesses. However, during the period 1979–1981 many low-cost machines designed expressly for the domestic market were offered by Atari, Coleco, Commodore, Tandy, Texas Instruments, Timex, and Sinclair.
The Best of 2600: A Hacker Odyssey by Emmanuel Goldstein
affirmative action, Apple II, call centre, don't be evil, Firefox, game design, Hacker Ethic, hiring and firing, information retrieval, late fees, license plate recognition, optical character recognition, packet switching, pirate software, place-making, profit motive, QWERTY keyboard, RFID, Robert Hanssen: Double agent, rolodex, Ronald Reagan, Silicon Valley, Skype, spectrum auction, statistical model, Steve Jobs, Steve Wozniak, Steven Levy, Telecommunications Act of 1996, telemarketer, Y2K
I am a modern hacker, but I’ve been interested in computers since I was a child in the early 1970s when “hack” meant “create” and not the current media corruption, which essentially translates to “destroy.” This was a time when there were no visible computers and the government still decided who had ARPANET access. Around then, the first ads started appearing for Steve Jobs’ and Steve Wozniak’s Apple II—a useful configuration cost the same as taking a family to Europe (or the United States if you’re European). A real physical computer like the ones I saw in the magazines that taught me to program was simply out of the question. My only computer was imaginary. It existed only as a simulation in my head and in my notebook—the old fashioned paper kind. My computer programs were just lists of commands and parameters on paper, much like those programs of the first hacker Alan Turing, who hand-simulated the world’s first chess program in the 1940s before the computers he fathered existed.
A good part of this issue is devoted to those matters and, as a result, many articles we were planning on running were bumped to the autumn issue. It would be nice if there was substantially less of this to report for our next issue. What is the EFF? (Summer, 1990) One of the results of our public outcry over the hacker raids this spring has been the formation of the Electronic Frontier Foundation (EFF). Founded by computer industry giants Mitch Kapor and Steve Wozniak along with writer John Barlow, the EFF sought to put an end to raids on publishers, bulletin board operators, and all of the others that have been caught up in recent events. The EFF founders, prior to the organization’s actual birth this summer, had said they would provide financial support to those affected by unjust Secret Service raids.
I don’t envy Kevin Mitnick for the ordeal he’s endured with the government. I think of myself as lucky to have never spent a day in jail. If I had, I don’t think I would have emerged a survivor. Quite honestly, I probably wouldn’t be here today. I don’t think this mark on my record, this felony, reflects with much accuracy what kind of person I am, or what kind of employee I am. Many youths do stupid things that aren’t necessarily injurious to anyone. Before Steve Wozniak and Steve Jobs co-founded Apple Computer, they “cheated” the phone company with a device called a “blue box” while in college at Berkeley, CA. Didn’t they turn into quasi-responsible multimillionaires? “They didn’t get caught,” a landlord whose rental operation routinely turned away convicted felons per police-sponsored programs told me.
Rise of the Machines: A Cybernetic History by Thomas Rid
1960s counterculture, A Declaration of the Independence of Cyberspace, agricultural Revolution, Albert Einstein, Alistair Cooke, Apple II, Apple's 1984 Super Bowl advert, back-to-the-land, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, business intelligence, Claude Shannon: information theory, conceptual framework, connected car, domain-specific language, Douglas Engelbart, dumpster diving, Extropian, full employment, game design, global village, Haight Ashbury, Howard Rheingold, Jaron Lanier, job automation, John von Neumann, Kevin Kelly, Marshall McLuhan, Menlo Park, Mother of all demos, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shale / tar sands, pattern recognition, RAND corporation, Silicon Valley, Simon Singh, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Telecommunications Act of 1996, telepresence, V2 rocket, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, Y2K, Yom Kippur War, Zimmermann PGP
They invented time-sharing, against the interest of large corporations, and gave more people access to SAGE-style supercomputers—in effect, turning mainframes into more widely accessible virtual personal computers. The second wave of hackers, in the late 1970s, overturned mainframes entirely by bringing the personal computer to market. Many of them were hard-core counterculture types—for instance, Steve Jobs and Steve Wozniak, two cofounders of Apple. They had honed their skills by developing, and then selling, so-called blue boxes, illegal phone phreaking devices to make free calls. Then came the third wave of “hackers,” the social hackers of the early 1980s. The personal computer and emerging network technology didn’t articulate an entire philosophy and aesthetic just by themselves. Of course, building software tools to connect and educate communities helped, and the then emerging free-software movement offered a promising platform.
For Christmas that year, Gibson finally bought an Apple II at a discount. The machine’s successor model, the Macintosh, had been launched so effectively nearly one year earlier with the legendary cyberpunk ad “1984,” but the older Apple II was still a best-selling device. When Gibson booted up the machine at home and got ready to use it, he was shocked by the computer’s mundane mechanical makeup. “Here I’d been expecting some exotic crystalline thing, a cyberspace deck or something, and what I’d gotten was something with this tiny piece of a Victorian engine in it, like an old record player.”46 The science fiction writer called up the store to complain. What was making this noise? The operator told him it was normal; the hard drive was simply spinning in the box that was the Apple II. Gibson’s ignorance about computers, he recounted, had allowed him to romanticize technology.
“Welcome to World War III, the Cybernetic War created by machines for machines.”1 The arsenals of cybernetic war were stocked with an array of fancy weaponry: cruise missiles, smart bombs, sophisticated intercontinental missiles with multiple warheads, and tools such as robotic pattern recognition, code, game theory, cryptography, and simulation. In 1979 these terms were still somewhat vague and undefined, all sounding equally futuristic. When Post’s article on the future of war came out, its analysis was at the cutting edge of technology. The US military was still reeling from defeat in Vietnam, a decidedly low-tech war. The Apple II had been released less than two years earlier, in June 1977. “What kind of man owns his own computer?” asked an Apple advertisement in Omni just after Post’s article: “Rather revolutionary, the idea of owning your own computer.”2 E-mail didn’t exist yet. Usenet, one of the world’s first computer network communication systems, had not been set up yet. CompuServe began offering a dial-up online information service to customers only four months later, in September 1979.3 Predicting ubiquitous networked computers the size of a book was a daring move.
Apple II, augmented reality, Bill Duvall, conceptual framework, Douglas Engelbart, Dynabook, experimental subject, Grace Hopper, hiring and firing, hypertext link, index card, information retrieval, invention of hypertext, Jaron Lanier, Jeff Rulifson, John von Neumann, knowledge worker, Menlo Park, Mother of all demos, new economy, Norbert Wiener, packet switching, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, RFC: Request For Comment, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, stochastic process, Ted Nelson, the medium is the message, theory of mind, Turing test, unbiased observer, Vannevar Bush, Whole Earth Catalog
(Johnson et al. 1989, 25-26) The "other" personal computer revolution that redefined the idea of the personal computer was indeed partly the result of the computing philosophy that had led to the Star and that had invented a personal user for the computer. The final stages of product development and marketing of the interface for the personal computer, however, occurred at Apple, not at Xerox. Apple and The End of the Bootstrapping Process The fairy-tale story of the founding of Apple Computer by Steve Jobs and Steve Wozniak, beginning with the Apple I and the meetings of computer hobbyists at the Homebrew Computer Club in a Palo Alto burger joint, is often told and need not be repeated here. It is necessary, however, to trace the path followed by Douglas Engelbart's innovations as they reached their terminus in the form in which they now are employed, a form very different from the one Engelbart had envisioned for them.
I thank all the editors of these journals for granting me the permission to reprint and update parts of these publications.

[Timeline figure: Developments in Computer Technology, 1943-1964 (computers such as the ENIAC, UNIVAC, and PDP series shown above the line; software and components, including the transistor of 1947 and the integrated circuit of 1958, below).]

[Timeline figure: Developments in Computer Technology, 1969-1984 (computers are shown above the line; software and components, below), spanning the NOVA and ALTO through the ALTAIR, APPLE II, IBM PC, and MACINTOSH.]

INTRODUCTION: Douglas Engelbart's Crusade for the Augmentation of Human Intellect

Journal entry 37.
Jobs foresaw the potential of such a technology for a marketable product. Two major factors influenced the success of the technology transfer of the graphic user interface from Xerox PARC to Apple. Jobs and Wozniak were connected to the hobbyist movement of the early 1970s, and by 1979, Apple had successfully moved from this hobbyist market to the office market, thanks to Visicalc, the first spreadsheet program developed for the Apple II. "The two Steves—Jobs and Wozniak—they understood their market. The way they understood their market was twofold: 1) they were it, and that's the best way to understand a market, and 2) they just liked to go to the Homebrew Computer Club and having the neatest thing. . . The neatest thing available to that community" (Belleville 1992). The first factor helped Jobs to realize a potential market existed for an individually owned personal computer.
A Declaration of the Independence of Cyberspace, AI winter, airport security, Apple II, artificial general intelligence, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Duvall, bioinformatics, Brewster Kahle, Burning Man, call centre, cellular automata, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, collective bargaining, computer age, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deskilling, don't be evil, Douglas Engelbart, Douglas Hofstadter, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, factory automation, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, Google Glasses, Google X / Alphabet X, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, haute couture, hive mind, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, Mother of all demos, natural language processing, new economy, Norbert Wiener, PageRank, pattern recognition, pre–internet, RAND corporation, Ray Kurzweil, Richard Stallman, Robert Gordon, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Nelson, telemarketer, telepresence, 
telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Turing test, Vannevar Bush, Vernor Vinge, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, William Shockley: the traitorous eight
The Stanford Artificial Intelligence Laboratory quickly became a California haven for the same hacker sensibility that had spawned at MIT. Smart young computer hackers like Steve “Slug” Russell and Whitfield Diffie followed McCarthy west, and during the next decade and a half a startling array of hardware engineers and software designers would flow through the laboratory, which maintained its countercultural vibe even as McCarthy became politically more conservative. Both Steve Jobs and Steve Wozniak would hold on to sentimental memories of their visits as teenagers to the Stanford laboratory in the hills. SAIL would become a prism through which a stunning group of young technologists as well as full-blown industries would emerge. Early work in machine vision and robotics began at SAIL, and the laboratory was indisputably the birthplace of speech recognition. McCarthy gave Raj Reddy his thesis topic on speech understanding, and Reddy went on to become the seminal researcher in the field.
The idea was that unskilled users would be able to retrieve information by posing queries in normal sentences. There was no money, only a promise of stock if the project took off. Kaplan’s expertise was in natural language front ends that would allow typed questions to an expert system. What Hendrix needed, however, was a simple database back end for his demonstration. And so over a Christmas holiday at the end of 1980, Kaplan sat down and programmed one. The entire thing initially ran on an Apple II. He did it on a contingent basis, and in fact he didn’t get rich. The first Symantec never went anywhere commercially and the venture capitalists did a “cram down,” a financial maneuver in which company founders often see their equity lose value in exchange for new investments. As a result, what little stock Kaplan owned was now worthless. In the end he left Stanford and joined Teknowledge because he admired Lee Hecht, the University of Chicago physicist and business school professor who had been brought in to be CEO and provide adult supervision for the twenty Stanford AI refugees who were the Teknowledge shock troops.
Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia by Anthony M. Townsend
1960s counterculture, 4chan, A Pattern Language, Airbnb, Amazon Web Services, anti-communist, Apple II, Bay Area Rapid Transit, Burning Man, business process, call centre, carbon footprint, charter city, chief data officer, clean water, cleantech, cloud computing, computer age, congestion charging, connected car, crack epidemic, crowdsourcing, DARPA: Urban Challenge, data acquisition, Deng Xiaoping, East Village, Edward Glaeser, game design, garden city movement, Geoffrey West, Santa Fe Institute, George Gilder, ghettoisation, global supply chain, Grace Hopper, Haight Ashbury, Hedy Lamarr / George Antheil, hive mind, Howard Rheingold, interchangeable parts, Internet Archive, Internet of things, Jacquard loom, Jane Jacobs, jitney, John Snow's cholera map, Khan Academy, Kibera, knowledge worker, load shedding, M-Pesa, Mark Zuckerberg, megacity, mobile money, mutually assured destruction, new economy, New Urbanism, Norbert Wiener, Occupy movement, openstreetmap, packet switching, patent troll, place-making, planetary scale, popular electronics, RFC: Request For Comment, RFID, ride hailing / ride sharing, Robert Gordon, self-driving car, sharing economy, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, smart grid, smart meter, social graph, social software, social web, special economic zone, Steve Jobs, Steve Wozniak, Stuxnet, supply-chain management, technoutopianism, Ted Kaczynski, telepresence, The Death and Life of Great American Cities, too big to fail, trade route, Tyler Cowen: Great Stagnation, Upton Sinclair, uranium enrichment, urban decay, urban planning, urban renewal, Vannevar Bush, working poor, working-age population, X Prize, Y2K, zero day, Zipcar
The Altair used the same Intel 8080 microprocessor and sold as a kit for less than $400. But you had to put the thing together yourself.12 Hobbyists quickly formed groups like Silicon Valley’s Homebrew Computer Club to trade tips, hacks, and parts for these DIY computers. Homebrew was a training camp for innovators like Apple cofounders Steve Jobs and Steve Wozniak who would overthrow IBM’s dominance of the computer industry. (According to Wozniak, the Apple I and Apple II were demo’d at Homebrew meetings repeatedly during their development.)13 Never before had so much computing power been put in the hands of so many. Grassroots smart-city technologies—mobile apps, community wireless networks, and open-source microcontrollers among them—are following a similar trajectory as the PC: from utopian idea to geek’s plaything to mass market.
Free Speech: Ten Principles for a Connected World by Timothy Garton Ash
A Declaration of the Independence of Cyberspace, Affordable Care Act / Obamacare, Andrew Keen, Apple II, Ayatollah Khomeini, battle of ideas, Berlin Wall, bitcoin, British Empire, Cass Sunstein, Chelsea Manning, citizen journalism, Clapham omnibus, colonial rule, crowdsourcing, David Attenborough, don't be evil, Edward Snowden, Etonian, European colonialism, eurozone crisis, failed state, Fall of the Berlin Wall, Ferguson, Missouri, Filter Bubble, financial independence, Firefox, Galaxy Zoo, global village, index card, Internet Archive, invention of movable type, invention of writing, Jaron Lanier, jimmy wales, Julian Assange, Mark Zuckerberg, Marshall McLuhan, megacity, mutually assured destruction, national security letter, Netflix Prize, Nicholas Carr, obamacare, Peace of Westphalia, Peter Thiel, pre–internet, profit motive, RAND corporation, Ray Kurzweil, Ronald Reagan, semantic web, Silicon Valley, Simon Singh, Snapchat, social graph, Stephen Hawking, Steve Jobs, Steve Wozniak, The Death and Life of Great American Cities, The Wisdom of Crowds, Turing test, We are Anonymous. We are Legion, WikiLeaks, World Values Survey, Yom Kippur War
The influence of a Steve Jobs or a Mark Zuckerberg on their respective empires has been more like that of an idiosyncratic absolute ruler in some mediaeval principate than that of the head of government in a modern liberal democracy. Apple’s tethered perfectionism has everything to do with Jobs’s personality. If the other Apple-founding Steve—Wozniak—had become Apple’s dominant figure, it might have remained the open, generative platform it was at the time of the 1977 Apple II desktop computer. For years, Google did not allow advertisements for cigarettes and hard liquor because Sergey Brin and Larry Page disapproved of them. Facebook’s insistence on people using their real names is, to a significant degree, a result of Zuckerberg’s personal attitude. If you have ever doubted the role of the individual in history, just look behind your screen.
3D printing, Affordable Care Act / Obamacare, airline deregulation, airport security, Apple II, barriers to entry, big-box store, blue-collar work, Capital in the Twenty-First Century by Thomas Piketty, clean water, collective bargaining, computer age, deindustrialization, Detroit bankruptcy, discovery of penicillin, Donner party, Downton Abbey, Edward Glaeser, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, feminist movement, financial innovation, full employment, George Akerlof, germ theory of disease, glass ceiling, high net worth, housing crisis, immigration reform, impulse control, income inequality, income per capita, indoor plumbing, industrial robot, inflight wifi, interchangeable parts, invention of agriculture, invention of air conditioning, invention of the telegraph, invention of the telephone, inventory management, James Watt: steam engine, Jeff Bezos, jitney, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, labor-force participation, Loma Prieta earthquake, Louis Daguerre, Louis Pasteur, low skilled workers, manufacturing employment, Mark Zuckerberg, market fragmentation, Mason jar, McMansion, Menlo Park, minimum wage unemployment, mortgage debt, mortgage tax deduction, new economy, Norbert Wiener, obamacare, occupational segregation, oil shale / tar sands, oil shock, payday loans, Peter Thiel, pink-collar, Productivity paradox, Ralph Nader, Ralph Waldo Emerson, refrigerator car, rent control, Robert X Cringely, Ronald Coase, school choice, Second Machine Age, secular stagnation, Skype, stem cell, Steve Jobs, Steve Wozniak, Steven Pinker, The Market for Lemons, Thomas Malthus, total factor productivity, transaction costs, transcontinental railway, traveling salesman, Triangle Shirtwaist Factory, Unsafe at Any Speed, Upton Sinclair, upwardly mobile, urban decay, urban planning, urban sprawl, washing machines reduced drudgery, Washington Consensus, Watson beat the top human players on Jeopardy!, We wanted flying cars, instead we got 140 characters, working poor, working-age population, Works Progress Administration, yield management
These were Paul Allen, a recent Washington State University dropout, and Bill Gates, who would use his Altair money to drop out of Harvard. In April 1975 they founded Microsoft together. A month earlier, another computer legend, Steve Wozniak, had attended the Homebrew Computer Club in California and was immediately inspired by what he saw. He began work on his own personal computer, the Apple I, and enlisted his friend Steve Jobs to help with sales. The machine had more memory and a cheaper microprocessor than the Altair and could be plugged into any television for use as a screen. Soon Jobs and Wozniak began work on the Apple II, which added a keyboard, color graphics, and an external cassette tape drive (soon replaced by floppy disks). But it was IBM’s first personal computer (PC), introduced in 1981, that revolutionized the market and soon made the Wang minicomputer and the memory typewriter obsolete.