Filter Bubble



pages: 274 words: 75,846

The Filter Bubble: What the Internet Is Hiding From You by Eli Pariser

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

A Declaration of the Independence of Cyberspace, A Pattern Language, Amazon Web Services, augmented reality, back-to-the-land, Black Swan, borderless world, Build a better mousetrap, Cass Sunstein, citizen journalism, cloud computing, cognitive dissonance, crowdsourcing, Danny Hillis, data acquisition, disintermediation, don't be evil, Filter Bubble, Flash crash, fundamental attribution error, global village, Haight Ashbury, Internet of things, Isaac Newton, Jaron Lanier, Jeff Bezos, jimmy wales, Kevin Kelly, knowledge worker, Mark Zuckerberg, Marshall McLuhan, megacity, Netflix Prize, new economy, PageRank, paypal mafia, Peter Thiel, recommendation engine, RFID, sentiment analysis, shareholder value, Silicon Valley, Silicon Valley startup, social graph, social software, social web, speech recognition, Startup school, statistical model, stem cell, Steve Jobs, Steven Levy, Stewart Brand, technoutopianism, the scientific method, urban planning, Whole Earth Catalog, WikiLeaks, Y Combinator

Together, these engines create a unique universe of information for each of us—what I’ve come to call a filter bubble—which fundamentally alters the way we encounter ideas and information. Of course, to some extent we’ve always consumed media that appealed to our interests and avocations and ignored much of the rest. But the filter bubble introduces three dynamics we’ve never dealt with before. First, you’re alone in it. A cable channel that caters to a narrow interest (say, golf) has other viewers with whom you share a frame of reference. But you’re the only person in your bubble. In an age when shared information is the bedrock of shared experience, the filter bubble is a centrifugal force, pulling us apart. Second, the filter bubble is invisible. Most viewers of conservative or liberal news sources know that they’re going to a station curated to serve a particular political viewpoint.

Though concerns about personalized media have been raised for a decade—legal scholar Cass Sunstein wrote a smart and provocative book on the topic in 2000—the theory is now rapidly becoming practice: Personalization is already much more a part of our daily experience than many of us realize. We can now begin to see how the filter bubble is actually working, where it’s falling short, and what that means for our daily lives and our society. Every technology has an interface, Stanford law professor Ryan Calo told me, a place where you end and the technology begins. And when the technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens. That’s a powerful position, Calo says. “There are lots of ways for it to skew your perception of the world.” And that’s precisely what the filter bubble does. THE FILTER BUBBLE’S costs are both personal and cultural. There are direct consequences for those of us who use personalized filters (and soon enough, most of us will, whether we realize it or not).

Personalization, in other words, may be driving us toward an Adderall society, in which hyperfocus displaces general knowledge and synthesis. Personalization can get in the way of creativity and innovation in three ways. First, the filter bubble artificially limits the size of our “solution horizon”—the mental space in which we search for solutions to problems. Second, the information environment inside the filter bubble will tend to lack some of the key traits that spur creativity. Creativity is a context-dependent trait: We’re more likely to come up with new ideas in some environments than in others; the contexts that filtering creates aren’t the ones best suited to creative thinking. Finally, the filter bubble encourages a more passive approach to acquiring information, which is at odds with the kind of exploration that leads to discovery. When your doorstep is crowded with salient content, there’s little reason to travel any farther.


pages: 270 words: 79,992

The End of Big: How the Internet Makes David the New Goliath by Nicco Mele

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, 4chan, A Declaration of the Independence of Cyberspace, Airbnb, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, Apple's 1984 Super Bowl advert, barriers to entry, Berlin Wall, big-box store, bitcoin, business climate, call centre, Cass Sunstein, centralized clearinghouse, Chelsea Manning, citizen journalism, cloud computing, collaborative consumption, collaborative editing, crony capitalism, cross-subsidies, crowdsourcing, David Brooks, death of newspapers, Donald Trump, Douglas Engelbart, en.wikipedia.org, Exxon Valdez, Fall of the Berlin Wall, Filter Bubble, Firefox, Galaxy Zoo, global supply chain, Google Chrome, Gordon Gekko, Hacker Ethic, Jaron Lanier, Jeff Bezos, jimmy wales, Julian Assange, Kevin Kelly, Khan Academy, Kickstarter, Lean Startup, Mark Zuckerberg, minimum viable product, Mohammed Bouazizi, Mother of all demos, Narrative Science, new economy, Occupy movement, Peter Thiel, pirate software, Ronald Reagan, Ronald Reagan: Tear down this wall, sharing economy, Silicon Valley, Skype, social web, Steve Jobs, Steve Wozniak, Stewart Brand, Stuxnet, Ted Nelson, Telecommunications Act of 1996, telemarketer, The Wisdom of Crowds, transaction costs, uranium enrichment, Whole Earth Catalog, WikiLeaks, Zipcar

Our former sense of citizenship, of belonging to a larger commonwealth, has given way to the “filter bubble” we now inhabit, in which our digital media sources serve up content based on what they think we want to read, creating a perverse kind of digital narcissism. Eli Pariser opens his book The Filter Bubble by describing two friends with similar demographic profiles who each google “BP” in the midst of the oil company’s disastrous Gulf of Mexico oil spill. One of his friends gets stock quotes and links to the company’s annual report; the other gets news articles about the spill and environmental activist alerts. How do we begin to get a national consensus on critical issues if everyone lives in a filter bubble of information that reinforces his or her beliefs? It’s easy to imagine two people looking up a presidential candidate online and getting radically different versions of the candidate’s bio and positions on issues.

I chuckled—it was a surprise, and I’m pretty sure my distinguished “friend” had no idea that the “Social Reader” was reporting back his every click to people like me.

The Filter Bubble

Entertainment consumption within our new social networking life also increasingly threatens to render us more isolated from one another. In the past, every American watched one of three television channels and read a daily newspaper, even if only for sports scores and coupons. A big, shared public sphere existed in which politicians, policy makers, leaders, and public intellectuals could argue and debate—what we commonly call the court of public opinion. By contrast, with the End of Big we inhabit a “filter bubble” in which our digital media sources—primarily Google and Facebook—serve up content based on what they think we want to read.32 Even your newsfeed on Facebook is algorithmically engineered to give you the material you’re most likely to click on, creating a perverse kind of digital narcissism, always serving you up the updates you want most.

Nicholas Negroponte called it the “Daily Me,” but it was Eli Pariser who coined the term “filter bubble” in his book of the same name.33 The danger of the personalization of entertainment becomes clear when we remember entertainment’s traditional social functions. We need quality entertainment not just because it’s fun but also because it brings us together as a democratic society. This is one of the few areas of life where we share a common bond—and it’s rapidly going away. That’s a bigger loss than it might seem. We have frequently in our history relied upon high-quality, big productions to help us work through pressing social and political issues.
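
The "algorithmically engineered" newsfeed described in these excerpts can be made concrete with a toy ranker. The sketch below is purely hypothetical (it is not Facebook's News Feed algorithm; the Post class, topics, and click counts are invented): it scores each candidate post with a crude predicted click probability derived from the user's past clicks and sorts on that score, which is exactly the mechanism that lets familiar topics crowd out everything else.

```python
# Illustrative sketch only: a toy click-driven feed ranker of the kind the excerpt
# describes, not any real platform's algorithm. All names and numbers are invented.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    text: str

def predicted_click_probability(user_history: dict[str, int], post: Post) -> float:
    """Estimate click likelihood from the share of past clicks on this topic."""
    total = sum(user_history.values()) or 1
    return user_history.get(post.topic, 0) / total

def rank_feed(user_history: dict[str, int], candidates: list[Post]) -> list[Post]:
    # Sorting purely by predicted clicks is what narrows the feed: topics the user
    # has never clicked on sink to the bottom and effectively vanish.
    return sorted(candidates,
                  key=lambda p: predicted_click_probability(user_history, p),
                  reverse=True)

if __name__ == "__main__":
    history = {"sports": 40, "pets": 10}          # past clicks by topic
    posts = [Post("a", "politics", "council vote tonight"),
             Post("b", "sports", "game recap"),
             Post("c", "pets", "new puppy")]
    for post in rank_feed(history, posts):
        print(post.topic, post.text)
```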


pages: 283 words: 85,824

The People's Platform: Taking Back Power and Culture in the Digital Age by Astra Taylor

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

A Declaration of the Independence of Cyberspace, Andrew Keen, barriers to entry, Berlin Wall, big-box store, Brewster Kahle, citizen journalism, cloud computing, collateralized debt obligation, Community Supported Agriculture, conceptual framework, corporate social responsibility, cross-subsidies, crowdsourcing, David Brooks, digital Maoism, disintermediation, don't be evil, Donald Trump, Edward Snowden, Fall of the Berlin Wall, Filter Bubble, future of journalism, George Gilder, Google Chrome, Google Glasses, hive mind, income inequality, informal economy, Internet Archive, Internet of things, invisible hand, Jane Jacobs, Jaron Lanier, Jeff Bezos, job automation, Julian Assange, Kevin Kelly, Kickstarter, knowledge worker, Mark Zuckerberg, means of production, Naomi Klein, Narrative Science, Network effects, new economy, New Journalism, New Urbanism, Nicholas Carr, oil rush, Peter Thiel, Plutocrats, plutocrats, pre–internet, profit motive, recommendation engine, Richard Florida, Richard Stallman, self-driving car, shareholder value, sharing economy, Silicon Valley, Silicon Valley ideology, slashdot, Slavoj Žižek, Snapchat, social graph, Steve Jobs, Stewart Brand, technoutopianism, trade route, Whole Earth Catalog, WikiLeaks, winner-take-all economy, Works Progress Administration, young professional

Concern that the Web will lead to narcissism, echo chambers, and balkanization is nothing new, but Pariser’s analysis points to something more insidious than the problem of homophily. The filters he warns about are not of our own making. We are not purposefully retreating into our own distinct worlds, becoming more insular. Instead, invisible filter bubbles are imposed on us. Online, no action goes untracked. Our prior choices are compiled, feeding the ids of what we could call algorithmic superegos—systems that determine what we see and what we don’t, channeling us toward certain choices while cutting others off. And while they may make the Internet less overwhelming, these algorithms are not neutral. “The rush to build the filter bubble is absolutely driven by commercial interests,” Pariser warns. “It’s becoming clearer and clearer that if you want to have lots of people use your Web site, you need to provide them with personally relevant information, and if you want to make the most money on ads, you need to provide them with relevant ads.”41 Ironically, what distinguishes this process from what Nicholas Negroponte enthusiastically described as “the Daily me”—the ability of individuals to customize their media diets thanks to digital technology—is that the personalization trend is not driven by individual demand but by the pursuit of profit via targeted advertising.

“If the news is important, it will find me.”4 (Notably, Bilton’s assertion was contradicted by a Harvard study that found eighteen- to twenty-nine-year-olds still prefer to get their political news from established newspapers, print or digital, rather than from the social media streams of their friends.)5 These two poles of opinion typify an ongoing debate about the way technology is transforming a younger generation’s relationship to traditional cultural forms, a debate that gets especially vehement around the question of journalism’s future—a topic with the profoundest of implications for the public sphere and the health of democracy. In the popular imagination, either the Internet has freed us from the stifling grip of the old, top-down mass media model, transforming consumers into producers and putting citizens on par with the powerful, or we have stumbled into a new trap, a social media hall of mirrors made up of personalized feeds, “filter bubbles,” narcissistic chatter, and half-truths. Young people are invoked to lend credence to both views: in the first scenario, they are portrayed as empowered and agile media connoisseurs who, refusing to passively consume news products handed down from on high, insist on contributing to the conversation; in the second, they are portrayed as pliant and ill-informed, mistaking what happens to interest them for what is actually important.

While more people are coming online and more content is being uploaded, our experience of the Web is becoming increasingly personalized. New mechanisms have emerged that sift through the chaos of online content, shaping it into a targeted stream. As a consequence, our exposure to difference may actually decrease. Eli Pariser, the former executive director of MoveOn.org and founder of the viral content site Upworthy, calls this problem the “filter bubble,” a phenomenon that stems from the efforts of new-media companies to track the things we like and try to give us more of the same. These mechanisms are “prediction engines,” Pariser says, “constantly creating and refining a theory of who you are and what you’ll want next.”40 This kind of personalization is already part of our daily experience in innumerable ways. If you and I search the same category on Google, we get different results based on our search histories.
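
Pariser's "prediction engine" idea can be illustrated with a small, hedged sketch: a toy re-ranker that takes the same base results for a query and orders them differently for each user according to that user's search history. Everything here (the scoring rule, the topic labels, the 2.0 weight) is an invented illustration, not Google's actual personalization system.

```python
# A hypothetical "prediction engine": the same base results, re-ranked per user
# from their search history. Toy code for illustration only.

from collections import Counter

def personalize(base_results: list[tuple[str, str]], history: list[str]) -> list[str]:
    """base_results: (title, topic) pairs in generic relevance order.
    history: topics of the user's past searches."""
    interest = Counter(history)
    def score(item: tuple[str, str]) -> float:
        _, topic = item
        base = len(base_results) - base_results.index(item)   # generic relevance
        boost = interest[topic]                                # personal bias
        return base + 2.0 * boost
    return [title for title, _ in sorted(base_results, key=score, reverse=True)]

results = [("BP share price and annual report", "finance"),
           ("BP oil spill: environmental impact", "environment")]

print(personalize(results, history=["stocks", "finance", "finance"]))
print(personalize(results, history=["climate", "environment"]))
# Two users issue the identical query and see the listings in a different order.
```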


pages: 259 words: 73,193

The End of Absence: Reclaiming What We've Lost in a World of Constant Connection by Michael Harris

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

4chan, Albert Einstein, AltaVista, Andrew Keen, augmented reality, Burning Man, cognitive dissonance, crowdsourcing, dematerialisation, en.wikipedia.org, Filter Bubble, Firefox, Google Glasses, informal economy, information retrieval, invention of movable type, invention of the printing press, invisible hand, James Watt: steam engine, Jaron Lanier, jimmy wales, Kevin Kelly, Loebner Prize, Marshall McLuhan, McMansion, Nicholas Carr, pattern recognition, pre–internet, Republic of Letters, Silicon Valley, Skype, Snapchat, social web, Steve Jobs, the medium is the message, The Wisdom of Crowds, Turing test

“While our algorithms will sometimes offer music that a user has chosen in the past, we have a mandate that the site always brings forward songs you don’t know you want yet. There’s always going to be both comfort food and something surprising.” Roman’s insistence on tastemaking flies in the face of most content providers, who seek only to gratify the known desires of users. And it’s an impulse that could go a long way toward countering what Internet activist Eli Pariser has dubbed “the filter bubble.” Here’s how a filter bubble works: Since 2009, Google has been anticipating the search results that you’d personally find most interesting and has been promoting those results each time you search, exposing you to a narrower and narrower vision of the universe. In 2013, Google announced that Google Maps would do the same, making it easier to find things Google thinks you’d like and harder to find things you haven’t encountered before.

A restaurateur in Ottawa’s famous ByWard Market: “Marisol Simoes Jailed: Co-owner of Kinki and Mambo in Ottawa Gets 90 Days for Defamation,” Huffington Post, accessed January 16, 2014, http://www.huffingtonpost.ca/2012/11/16/marisol-simoes-jailed_n_2146205.html. “Today’s internet is killing our culture”: Andrew Keen, The Cult of the Amateur (New York: Doubleday/Currency, 2007). “the filter bubble”: Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin Press, 2011). Google announced that Google Maps: Evgeny Morozov, “My Map or Yours?,” Slate, accessed September 4, 2013, http://www.slate.com/articles/technology/future_tense/2013/05/google_maps_personalization_will_hurt_public_space_and_engagement.html. “Bullshit is unavoidable”: Harry G.

Facebook follows suit, presenting a curated view of your “friends’” activities in your feed. Eventually, the information you’re dealing with absolutely feels more personalized; it confirms your beliefs, your biases, your experiences. And it does this to the detriment of your personal evolution. Personalization—the glorification of your own taste, your own opinion—can be deadly to real learning. Only if sites like Songza continue to insist on “surprise” content will we escape the filter bubble. Praising and valuing those rare expert opinions may still be the best way to expose ourselves to the new, the adventurous, the truly revelatory. • • • • • Commensurate with the devaluing of expert opinion is the hypervaluing of amateur, public opinion—for its very amateurism. Often a comment field will be freckled with the acronym IMHO, which stands for the innocuous phrase “in my honest opinion” (or, alternatively, “in my humble opinion”).


pages: 268 words: 75,850

The Formula: How Algorithms Solve All Our Problems-And Create More by Luke Dormehl

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, algorithmic trading, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, big data - Walmart - Pop Tarts, call centre, Cass Sunstein, Clayton Christensen, computer age, death of newspapers, deferred acceptance, Edward Lorenz: Chaos theory, Erik Brynjolfsson, Filter Bubble, Flash crash, Florence Nightingale: pie chart, Frank Levy and Richard Murnane: The New Division of Labor, Google Earth, Google Glasses, High speed trading, Internet Archive, Isaac Newton, Jaron Lanier, Jeff Bezos, job automation, Kevin Kelly, Kodak vs Instagram, Marshall McLuhan, means of production, Nate Silver, natural language processing, Netflix Prize, pattern recognition, price discrimination, recommendation engine, Richard Thaler, Rosa Parks, self-driving car, sentiment analysis, Silicon Valley, Silicon Valley startup, Slavoj Žižek, social graph, speech recognition, Steve Jobs, Steven Levy, Steven Pinker, Stewart Brand, the scientific method, The Signal and the Noise by Nate Silver, upwardly mobile, Wall-E, Watson beat the top human players on Jeopardy!, Y Combinator

“I’m interested in trying to subvert all of that; removing the clutter and noise to create a more efficient way to help users gain access to things.” The problem, of course, is that in order to save you time by removing the “clutter” of the online world, Nara’s algorithms must make constant decisions on behalf of the user about what they should and should not see. This effect is often called the “filter bubble.” In his book of the same title, Eli Pariser notes how two different users searching for the same thing using Google will receive very different sets of results.36 A liberal who types “BP” into his browser might get information about the April 2010 oil spill in the Gulf of Mexico, while a conservative typing the same two letters is more likely to receive investment information about the oil company.

Unlike the libertarian technologist’s pipe dream of a world that is free, flat and open to all voices, a key component of code and algorithmic culture is software’s task of sorting, classifying and creating hierarchies. Since so much of the revenue of companies like Google depends on the cognitive capital generated by users, this “software sorting” immediately does away with the idea that there is no such thing as a digital caste system. As with the “filter bubble,” it can be difficult to tell whether the endless distinctions made regarding geo-demographic profiles are helpful examples of mass customization or exclusionary examples of coded discrimination. Philosopher Félix Guattari imagined the city in which a person was free to leave their apartment, their street or their neighborhood thanks to an electronic security card that raised barriers at each intersection.

“In the past, such a notion would have been unbelievable,” Google crowed in promotional literature. “[A] map was just a map, and you got the same one for New York City, whether you were searching for the Empire State Building or the coffee shop down the street. What if, instead, you had a map that’s unique to you, always adapting to the task you want to perform right this minute?” But while this might be helpful in some senses, its intrinsic “filter bubble” effect may also result in users experiencing less of the serendipitous discovery than they would by using a traditional map. Like the algorithmic matching of a dating site, only those people and places determined on your behalf as suitable or desirable will show up.33 As such, while applying The Formula to the field of cartography might be a logical step for Google, it is potentially troubling.


pages: 327 words: 88,121

The Vanishing Neighbor: The Transformation of American Community by Marc J. Dunkelman

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Affordable Care Act / Obamacare, Albert Einstein, Berlin Wall, big-box store, blue-collar work, Bretton Woods, Broken windows theory, call centre, clean water, cuban missile crisis, dark matter, David Brooks, delayed gratification, double helix, Downton Abbey, Edward Glaeser, Fall of the Berlin Wall, Filter Bubble, Francis Fukuyama: the end of history, Gini coefficient, glass ceiling, global supply chain, global village, helicopter parent, if you build it, they will come, impulse control, income inequality, invention of movable type, Jane Jacobs, Khyber Pass, Louis Pasteur, Marshall McLuhan, Martin Wolf, McMansion, Nate Silver, Nicholas Carr, obamacare, Occupy movement, Peter Thiel, post-industrial society, Richard Florida, rolodex, Saturday Night Live, Silicon Valley, Skype, Steve Jobs, telemarketer, The Chicago School, The Death and Life of Great American Cities, the medium is the message, Thomas L Friedman, Tyler Cowen: Great Stagnation, urban decay, urban planning, Walter Mischel, War on Poverty, women in the workforce, World Values Survey

No longer do we have to trudge to the library and pull out a volume of the Reader’s Guide to Periodical Literature to find an article detailing the electoral landscape in the panhandle of Michigan; with a few keystrokes from our living room couch we can call up every article ever written on the subject. The advent of social networking has only served to press the point further. As Eli Pariser recently argued in The Filter Bubble, without our even knowing it, many software companies have found ways to predict the information we want and provide it to the exclusion of everything else.5 As a further convenience (of sorts), the previous searches need not be related; it’s now possible, for example, to associate our political sensibilities with our taste in restaurants and alcohol. As the Web site Buzzfeed highlighted during the run-up to the 2012 presidential election, the Obama and Romney campaigns were able to discern that Facebook users who “liked” Cracker Barrel’s feed were highly likely to vote for the Republican, while those who “liked” Jamba Juice were more likely to prefer the incumbent.6 The same trend can be discerned in what’s happened in the world of advertising.

Yes, there’s no guarantee that a town that incubates various interests will drive new ideas. Nevertheless, it seems less likely that networks of individuals ensconced in conversations on topics with which they agree will yield as many big breakthroughs. And that marks our central challenge. No one can claim credibly that Americans are intellectually isolated today, even if we are caught in what Eli Pariser once termed “filter bubbles.”36 What remains to be seen—and what ought to worry anyone looking at the future of economic growth—is whether the new arena for the Medici effect will, in the end, prove as effective as the old. The manner in which Americans organized themselves—what Tocqueville saw as such a crucial element of American exceptionalism—had an altogether underappreciated effect on the growth and dynamism of the American economy.

Beck, “The Buses Are Coming,” Next American City, Summer 2010. 2. Charles Murray, Coming Apart: The State of White America 1960–2010 (New York: Crown Forum, 2012), 2. 3. Thomas E. Mann and Norman J. Ornstein, It’s Even Worse Than It Looks: How the American Constitutional System Collided with the New Politics of Extremism (New York: Basic Books, 2012), 59. 4. Bill Carter, “Prime-Time Ratings Bring Speculation of a Shift in Habits,” New York Times, April 23, 2012. 5. Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011), 6–10. 6. Ruby Cramer, “2 Charts That Explain What Your Food Says About Your Politics,” Buzzfeed.com, October 31, 2012, http://www.buzzfeed.com/rubycramer/2-charts-that-explain-what-your-food-says-about-yo. 7. Natasha Singer, “Your Online Attention, Bought in an Instant,” New York Times, November 17, 2012. 8. Kenneth T. Jackson, Crabgrass Frontier: The Suburbanization of the United States (New York: Oxford University Press, 1985). 9. Lizabeth Cohen, A Consumer’s Republic: The Politics of Mass Consumption in Postwar America (New York: Vintage Books, 2003), 288–89, 292–344. 10. Chris Rock, Saturday Night Live, November 2, 1996. 11. Cohen, A Consumer’s Republic, 258. 12. Douglas S.


pages: 527 words: 147,690

Terms of Service: Social Media and the Price of Constant Connection by Jacob Silverman

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

23andMe, 4chan, A Declaration of the Independence of Cyberspace, Airbnb, airport security, Amazon Mechanical Turk, augmented reality, Brian Krebs, California gold rush, call centre, cloud computing, cognitive dissonance, correlation does not imply causation, Credit Default Swap, crowdsourcing, don't be evil, Edward Snowden, feminist movement, Filter Bubble, Firefox, Flash crash, game design, global village, Google Chrome, Google Glasses, hive mind, income inequality, informal economy, information retrieval, Internet of things, Jaron Lanier, jimmy wales, Kevin Kelly, Kickstarter, knowledge economy, knowledge worker, late capitalism, license plate recognition, life extension, Lyft, Mark Zuckerberg, Mars Rover, Marshall McLuhan, meta analysis, meta-analysis, Minecraft, move fast and break things, national security letter, Network effects, new economy, Nicholas Carr, Occupy movement, optical character recognition, payday loans, Peter Thiel, postindustrial economy, prediction markets, pre–internet, price discrimination, price stability, profit motive, quantitative hedge fund, race to the bottom, Ray Kurzweil, recommendation engine, rent control, RFID, ride hailing / ride sharing, self-driving car, sentiment analysis, shareholder value, sharing economy, Silicon Valley, Silicon Valley ideology, Snapchat, social graph, social web, sorting algorithm, Steve Ballmer, Steve Jobs, Steven Levy, TaskRabbit, technoutopianism, telemarketer, transportation-network company, Turing test, Uber and Lyft, Uber for X, universal basic income, unpaid internship, women in the workforce, Y Combinator, Zipcar

Upworthy was founded by Eli Pariser, who previously was the executive director of liberal advocacy organization MoveOn before going on to write The Filter Bubble, a book warning that search algorithms, by taking into account our preferences, browser histories, locations, and other personal information, limit our ability to access diverse points of view. This all became richly ironic when Pariser founded Upworthy in March 2012. Upworthy is a self-proclaimed liberal site trafficking only in positive and uplifting messages; by its very construction, it’s dedicated to pushing one political point of view bound in a certain package. It’s confined itself to its own filter bubble. Many media outlets and writers have political biases, acknowledged or otherwise, but Pariser’s own history of writing about the dangers of being exposed to a narrow range of views makes this a curious enterprise, though his past career as a political operative goes some way toward explaining it.

(It’s also why some of us love seeing characters who remind us of ourselves or our friends on TV. The rise of the HBO series Girls owes much to the fact that it chronicles the lives of the same type of people who would cover it professionally—young writers and creative people living in Brooklyn.) Thisness is the flattery of representation in concentrated form. Thisness allows us to feel like pieces of media were made for us. Thisness is your own personal filter bubble, showing you exactly what you want. (It also allows BuzzFeed to divide their readers into highly specific, personalized categories, aiding in future targeting efforts.) It’s not just for entertainment. Outrage and grievance also play well, which is why talk radio and strident political blogging have boomed in the last fifteen years. It’s an equally powerful technique in advertising, but it’s more troubling when applied to journalism.

Players could also swap or share profiles. (By default, Vortex gives “Narnia” as a user’s location, though that can be adjusted as part of the game.) Although Vortex wasn’t released into the wild—it was a thesis project, it had some security holes, and browser companies would’ve been unlikely to allow it—it was social-media rebellion at its best. As Law wrote, “The ‘Internet’ does not exist. Instead, it is many overlapping filter bubbles which selectively curate us into data objects to be consumed and purchased by advertisers.” Her program, even if it never was made publicly available, brilliantly illuminated these points. It’s the kind of project that’s deeply revealing of how the surveillance economy works: by arranging us into limited categories and subgroups that can easily be managed and monitored, with our data and attention bought and sold accordingly.


pages: 390 words: 96,624

Consent of the Networked: The Worldwide Struggle for Internet Freedom by Rebecca MacKinnon

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

A Declaration of the Independence of Cyberspace, Bay Area Rapid Transit, Berlin Wall, business intelligence, Cass Sunstein, Chelsea Manning, citizen journalism, cloud computing, cognitive dissonance, collective bargaining, conceptual framework, corporate social responsibility, Deng Xiaoping, digital Maoism, don't be evil, Filter Bubble, Firefox, future of journalism, illegal immigration, Jaron Lanier, Jeff Bezos, Julian Assange, Mark Zuckerberg, Mikhail Gorbachev, national security letter, online collectivism, pre–internet, race to the bottom, Richard Stallman, Ronald Reagan, sharing economy, Silicon Valley, Silicon Valley startup, Skype, Steve Crocker, Steven Levy, WikiLeaks

Also see John Pomfret, “In China, Google Users Worry They May Lose an Engine of Progress,” Washington Post, March 20, 2010, www.washingtonpost.com/wp-dyn/content/article/2010/03/19/AR2010031900986.html (accessed June 21, 2011). 9 geopolitical vision for a digitally networked world: Eric Schmidt and Jared Cohen, “The Digital Disruption: Connectivity and the Diffusion of Power,” Foreign Affairs 89, no. 6 (November/December 2010), 75–85. 10 In his book The Filter Bubble, Eli Pariser: Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011). 10 Siva Vaidhyanathan warns: Siva Vaidhyanathan, The Googlization of Everything (And Why We Should Worry) (Berkeley: University of California Press, 2011). 10 As Harvard’s Joseph Nye points out in The Future of Power: Joseph S. Nye Jr., The Future of Power (New York: PublicAffairs, 2011). 11 Other kinds of transnational organizations are also challenging the power of nation-states: One of the earliest policy analyses of the challenge posed by transnational organizations to the power of nation-states, and how the Internet had amplified the power of new actors, was by Jessica T.

In April 2011, researchers exposed that Apple iPhones were logging and storing detailed information about users’ movements, unbeknownst to most iPhone users. (Apple later fixed what it described as a “bug” in the phone’s operating software.) Companies argue that collecting a wide array of personal data is necessary to serve people better, in ways most people have shown that they want. Critics argue that companies have gone far beyond what most citizens actually want—when they have a chance to understand what is really going on. In his book The Filter Bubble, Eli Pariser warns that search engines and social networks manipulate what we find and who we interact with on the Web in a way that maximizes our value to advertisers but that is likely to minimize the chances that we will be exposed to a sufficiently diverse range of news and views that we need as citizens to make informed political and economic choices. In The Googlization of Everything, Siva Vaidhyanathan warns that Google in particular represents a new ideology that he calls “techno-fundamentalism,” which encourages a dangerously “blind faith in technology” on the part of people who use Google services.

[Back-of-book index excerpt; entries include “The Filter Bubble (Pariser).”]


pages: 322 words: 84,752

Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up by Philip N. Howard

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Affordable Care Act / Obamacare, Berlin Wall, bitcoin, blood diamonds, Bretton Woods, Brian Krebs, British Empire, call centre, Chelsea Manning, citizen journalism, clean water, cloud computing, corporate social responsibility, crowdsourcing, Edward Snowden, en.wikipedia.org, failed state, Fall of the Berlin Wall, feminist movement, Filter Bubble, Firefox, Francis Fukuyama: the end of history, Google Earth, Howard Rheingold, income inequality, informal economy, Internet of things, Julian Assange, Kibera, Kickstarter, land reform, M-Pesa, Marshall McLuhan, megacity, Mikhail Gorbachev, mobile money, Mohammed Bouazizi, national security letter, Network effects, obamacare, Occupy movement, packet switching, pension reform, prediction markets, sentiment analysis, Silicon Valley, Skype, spectrum auction, statistical model, Stuxnet, trade route, uranium enrichment, WikiLeaks, zero day

They depend on open-source software, whose performance and security can be uneven, and on free services that include product-placement advertising. They tend to be run by volunteers and strapped for cash; rarely do they have the resources to invest in good information infrastructure. The world’s authoritarian governments are better positioned than civil society groups for the internet of things. Bots and Simulations Any device network we build will create some kind of what Eli Pariser calls a filter bubble around us.31 We will be choosing which devices to connect, and those devices will both collect information about us and provide information to us. But the danger is not so much that our information supplies may be constrained by the devices we purposefully select. It is the danger that our information supplies may be manipulated by people and scripts we don’t know about. The word “botnet” comes from combining “robot” with “network,” and it describes a collection of programs that communicate across multiple devices to perform some task.

Brian Krebs, “Amnesty International Site Serving Java Exploit,” Krebs on Security, December 22, 2011, accessed September 30, 2014, http://krebsonsecurity.com/2011/12/amnesty-international-site-serving-java-exploit/. 30. @indiankanoon, “IK Servers Are Getting DDoSed Using the DNS Reflection Attack,” Indian Kanoon (October 19, 2013), accessed September 30, 2014, https://twitter.com/indiankanoon/status/391497714451492865. 31. Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (London: Penguin, 2011). 32. Keith Wagstaff, “1 in 10 Twitter Accounts Is Fake, Say Researchers,” NBC News, November 26, 2013, accessed September 30, 2014, http://www.nbcnews.com/technology/1-10-twitter-accounts-fake-say-researchers-2D11655362; Won Kim et al., “On Botnets,” in Proceedings of the 12th International Conference on Information Integration and Web-Based Applications and Services (New York: ACM, 2010), 5–10, accessed September 30, 2014, http://dl.acm.org/citation.cfm?

[Back-of-book index excerpt; under “device networks” its entries include “filter bubbles and, 202.”]


pages: 371 words: 108,317

The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future by Kevin Kelly

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, A Declaration of the Independence of Cyberspace, AI winter, Airbnb, Albert Einstein, Amazon Web Services, augmented reality, bank run, barriers to entry, Baxter: Rethink Robotics, bitcoin, blockchain, book scanning, Brewster Kahle, Burning Man, cloud computing, computer age, connected car, crowdsourcing, dark matter, dematerialisation, Downton Abbey, Edward Snowden, Elon Musk, Filter Bubble, Freestyle chess, game design, Google Glasses, hive mind, Howard Rheingold, index card, indoor plumbing, industrial robot, Internet Archive, Internet of things, invention of movable type, invisible hand, Jaron Lanier, Jeff Bezos, job automation, Kevin Kelly, Kickstarter, linked data, Lyft, M-Pesa, Marshall McLuhan, means of production, megacity, Minecraft, multi-sided market, natural language processing, Netflix Prize, Network effects, new economy, Nicholas Carr, peer-to-peer lending, personalized medicine, placebo effect, planetary scale, postindustrial economy, recommendation engine, RFID, ride hailing / ride sharing, Rodney Brooks, self-driving car, sharing economy, Silicon Valley, slashdot, Snapchat, social graph, social web, software is eating the world, speech recognition, Stephen Hawking, Steven Levy, Ted Nelson, the scientific method, transport as a service, two-sided market, Uber for X, Watson beat the top human players on Jeopardy!, Whole Earth Review

engineered a way to automatically map one’s position in the field of choices visually, to make the bubble visible, which made it easier for someone to climb out of their filter bubble by making small tweaks in certain directions. Second, in the ideal approach, I’d like to know what my friends like that I don’t know about. In many ways, Twitter and Facebook serve up this filter. By following your friends, you get effortless updates on the things they find cool enough to share. Shouting out a recommendation via a text or photo is so easy from a phone that we are surprised when someone loves something new but doesn’t share it. But friends can also act like a filter bubble if they are too much like you. Close friends can make an echo chamber, amplifying the same choices. Studies show that going to the next circle, to friends of friends, is sometimes enough to enlarge the range of options away from the expected.
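
Kelly's point about "going to the next circle" is essentially a graph traversal: widen the pool of recommenders from direct friends to friends of friends and collect what they like. Below is a minimal sketch under that assumption; the social graph, the likes, and the function name are all made up for illustration.

```python
# A small sketch of the "next circle" idea: gather recommendations not just from
# direct friends but from friends of friends. The graph and likes are invented.

def circle_recommendations(graph: dict[str, set[str]],
                           likes: dict[str, set[str]],
                           user: str,
                           hops: int = 2) -> set[str]:
    """Collect items liked by anyone within `hops` steps of `user` in the graph."""
    frontier = {user}
    seen = {user}
    for _ in range(hops):
        frontier = {friend for person in frontier
                    for friend in graph.get(person, set())} - seen
        seen |= frontier
    recommended = set()
    for person in seen - {user}:
        recommended |= likes.get(person, set())
    return recommended - likes.get(user, set())   # drop what the user already knows

graph = {"me": {"ana"}, "ana": {"me", "raj"}, "raj": {"ana"}}
likes = {"me": {"jazz"}, "ana": {"jazz", "indie"}, "raj": {"field recordings"}}

print(circle_recommendations(graph, likes, "me", hops=1))  # {'indie'}
print(circle_recommendations(graph, likes, "me", hops=2))  # adds 'field recordings'
```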

There are of course no humans involved in guiding these filters once they are operational. The cognification is based on subtle details of my (and others’) behavior that only a sleepless obsessive machine might notice. The danger of being rewarded with only what you already like, however, is that you can spin into an egotistical spiral, becoming blind to anything slightly different, even if you’d love it. This is called a filter bubble. The technical term is “overfitting.” You get stuck at a lower than optimal peak because you behave as if you have arrived at the top, ignoring the adjacent environment. There’s a lot of evidence this occurs in the political realm as well: Readers of one political stripe who depend only on a simple filter of “more like this” rarely if ever read books outside their stripe. This overfitting tends to harden their minds.
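
One standard way engineers counter the overfitting Kelly describes is to mix a little deliberate exploration into an otherwise "more like this" recommender, for example with an epsilon-greedy rule. The sketch below is illustrative only: the preference scores are invented, and epsilon-greedy is a generic technique, not something the excerpt attributes to any particular service.

```python
# Sketch of the overfitting trap and one common escape hatch: an epsilon-greedy
# recommender that mostly serves "more like this" but occasionally explores.
# Purely illustrative; categories and probabilities are made up.

import random

def more_like_this(preferences: dict[str, float]) -> str:
    # Pure exploitation: always the single best-known category. Over time the
    # user sees nothing else, which is the filter-bubble/overfitting failure mode.
    return max(preferences, key=preferences.get)

def epsilon_greedy(preferences: dict[str, float], epsilon: float = 0.1) -> str:
    # With probability epsilon, recommend something outside the known favorite.
    if random.random() < epsilon:
        return random.choice(list(preferences))
    return more_like_this(preferences)

prefs = {"thrillers": 0.9, "poetry": 0.2, "history": 0.4}
random.seed(0)
picks = [epsilon_greedy(prefs, epsilon=0.2) for _ in range(20)]
print(picks.count("thrillers"), "exploit picks out of 20; the rest are exploration")
```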

[Back-of-book index excerpt; entries include “filtering, 165–91” and “filter bubble, 170.”]


pages: 267 words: 82,580

The Dark Net by Jamie Bartlett

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, 4chan, bitcoin, blockchain, brain emulation, carbon footprint, crowdsourcing, cryptocurrency, deindustrialization, Edward Snowden, Filter Bubble, Francis Fukuyama: the end of history, global village, Google Chrome, Howard Rheingold, Internet of things, invention of writing, Johann Wolfgang von Goethe, Julian Assange, Kuwabatake Sanjuro: assassination market, life extension, litecoin, Mark Zuckerberg, Marshall McLuhan, moral hazard, Occupy movement, pre–internet, Ray Kurzweil, Satoshi Nakamoto, Skype, slashdot, technological singularity, technoutopianism, Ted Kaczynski, The Coming Technological Singularity, Turing test, Vernor Vinge, WikiLeaks, Zimmermann PGP

In January 2014, Robinson was convicted of mortgage fraud and sentenced to eighteen months in prison. At the time of writing – June 2014 – he is out on early release. p.69 ‘Creating our own realities is nothing new . . .’ The American academic Eli Pariser has documented what he calls the online ‘filter bubble’: people increasingly surround themselves with information that corroborates their own world view and reduces their exposure to conflicting information. Pariser, E., The Filter Bubble: What the Internet is Hiding From You. In the UK, we already have what is called a ‘reality–perception gap’. For example, in a 2011 survey, 62 per cent of respondents thought of ‘asylum seekers’ when asked what they associate with immigrants. In fact, asylum seekers are only 4 per cent of the immigrant population.

., Assassination Politics, http://www.outpost-of-freedom.com/jimbellap.htm. Boyd, D., It’s Complicated: The Social Lives of Networked Teens. An incredibly useful and clear-eyed account of young people’s relationship with social networks. Hafner, K. and Lyon, M., Where Wizards Stay Up Late: The Origins of the Internet. Krotoski, A., Untangling the Web: What the Internet is Doing to You. Pariser, E., The Filter Bubble: What the Internet is Hiding from You. Suler, J., ‘The Online Disinhibition Effect’, in CyberPsychology and Behaviour. An extremely influential theory about what effect communicating from behind a screen has on us. Turkle, S., The Second Self, Life on the Screen, and Alone Together. Sherry Turkle is without question one of the world’s experts on this subject, and someone whose studies on the impact of computers on human behaviour and identity are required reading.


pages: 23 words: 5,264

Designing Great Data Products by Jeremy Howard, Mike Loukides, Margit Zwemer

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

AltaVista, Filter Bubble, PageRank, pattern recognition, recommendation engine, self-driving car, sentiment analysis, Silicon Valley, text mining

She cut through the chaff of the obvious to make a recommendation that will send the customer home with a new book and keep them returning to Strand again and again in the future. This is not to say that Amazon’s recommendation engine could not have made the same connection; the problem is that this helpful recommendation will be buried far down in the recommendation feed, beneath books that have more obvious similarities to “Beloved.” The objective is to escape a recommendation filter bubble, a term originally coined by Eli Pariser to describe the tendency of personalized news feeds to only display articles that are blandly popular or further confirm the readers’ existing biases. As with the AltaVista-Google example, the lever a bookseller can control is the ranking of the recommendations. New data must also be collected to generate recommendations that will cause new sales.
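
Since the passage identifies ranking as the lever a bookseller controls, here is a hedged sketch of one way to pull it: re-rank candidate recommendations by relevance minus a penalty for similarity to what the customer already owns, so that a serendipitous title can outrank an obvious near-duplicate. The scores, titles, and novelty_weight parameter are invented for illustration.

```python
# A toy re-ranking rule that trades predicted relevance against similarity to the
# customer's past purchases. Scores and the similarity map are made up.

def rerank(candidates: list[tuple[str, float]],
           similarity_to_history: dict[str, float],
           novelty_weight: float = 0.5) -> list[str]:
    """candidates: (title, relevance) pairs; similarity_to_history values in [0, 1]."""
    def score(item: tuple[str, float]) -> float:
        title, relevance = item
        # Penalize items that merely repeat the customer's past purchases.
        return relevance - novelty_weight * similarity_to_history.get(title, 0.0)
    return [title for title, _ in sorted(candidates, key=score, reverse=True)]

candidates = [("Another novel just like the last one", 0.92),
              ("A left-field pick the buyer wouldn't search for", 0.80)]
similarity = {"Another novel just like the last one": 0.9,
              "A left-field pick the buyer wouldn't search for": 0.1}

print(rerank(candidates, similarity, novelty_weight=0.5))
# The less obvious title now ranks first, the "escape the bubble" effect in miniature.
```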


pages: 598 words: 134,339

Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World by Bruce Schneier

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

23andMe, Airbnb, airport security, AltaVista, Anne Wojcicki, augmented reality, Benjamin Mako Hill, Black Swan, Brewster Kahle, Brian Krebs, call centre, Cass Sunstein, Chelsea Manning, citizen journalism, cloud computing, congestion charging, disintermediation, Edward Snowden, experimental subject, failed state, fault tolerance, Ferguson, Missouri, Filter Bubble, Firefox, friendly fire, Google Chrome, Google Glasses, hindsight bias, informal economy, Internet Archive, Internet of things, Jacob Appelbaum, Jaron Lanier, Julian Assange, Kevin Kelly, license plate recognition, linked data, Lyft, Mark Zuckerberg, Nash equilibrium, Nate Silver, national security letter, Network effects, Occupy movement, payday loans, pre–internet, price discrimination, profit motive, race to the bottom, RAND corporation, recommendation engine, RFID, self-driving car, Silicon Valley, Skype, smart cities, smart grid, Snapchat, social graph, software as a service, South China Sea, stealth mode startup, Steven Levy, Stuxnet, TaskRabbit, telemarketer, Tim Cook: Apple, transaction costs, Uber and Lyft, urban planning, WikiLeaks, zero day

The first listing in a Google search: Chitika Online Advertising Network (7 Jun 2013), “The value of Google result positioning,” https://cdn2.hubspot.net/hub/239330/file-61331237-pdf/ChitikaInsights-ValueofGoogleResultsPositioning.pdf. the Internet you see: Joseph Turow (2013), The Daily You: How the New Advertising Industry Is Defining Your Identity and Your Worth, Yale University Press, http://yalepress.yale.edu/yupbooks/book.asp?isbn=9780300165012. the “filter bubble”: Eli Pariser (2011), The Filter Bubble: What the Internet Is Hiding from You, Penguin Books, http://www.thefilterbubble.com. on a large scale it’s harmful: Cass Sunstein (2009), Republic.com 2.0, Princeton University Press, http://press.princeton.edu/titles/8468.html. We don’t want to live: To be fair, this trend is older and more general than the Internet. Robert D. Putnam (2000), Bowling Alone: The Collapse and Revival of American Community, Simon and Schuster, http://bowlingalone.com.

Many companies manipulate what you see according to your user profile: Google search, Yahoo News, even online newspapers like the New York Times. This is a big deal. The first listing in a Google search result gets a third of the clicks, and if you’re not on the first page, you might as well not exist. The result is that the Internet you see is increasingly tailored to what your profile indicates your interests are. This leads to a phenomenon that political activist Eli Pariser has called the “filter bubble”: an Internet optimized to your preferences, where you never have to encounter an opinion you don’t agree with. You might think that’s not too bad, but on a large scale it’s harmful. We don’t want to live in a society where everybody only ever reads things that reinforce their existing opinions, where we never have spontaneous encounters that enliven, confound, confront, and teach us. In 2012, Facebook ran an experiment in control.

., 170, 183–84 Espionage Act (1917), 101 Estonia, cyberattacks on, 75, 132 Ethiopia, 73 European Charter, 169 European Court of Justice, 202, 222 European Parliament, 76 European Union (EU), 195, 200, 202, 226, 238 Charter of Fundamental Rights of, 232, 364 Data Protection Directive of, 19, 79, 80, 159, 191, 209 data retention rules in, 222 Exact Data, 42 executive branch: abuses of power by, 234–35 secrecy of, 100, 170 Executive Order 12333, 65, 173 Facebook, 58, 59, 93, 198 customer scores and, 111 data collection by, 19, 31, 41, 123, 200, 201, 204 as information middleman, 57 manipulation of posts on, 115 paid placements on, 114 real name policy of, 49 Facebook, surveillance by: data-based inferences of, 34, 258 Like button and, 48 relationship mapping by, 37–38 tagged photo database of, 41 face recognition, automatic, 27, 29, 31, 41, 211 fair information practices, 194, 211 fair lending laws, 196 false positives, 137, 138, 140, 323–24 Farrell, Henry, 60 FASCIA, 3 fatalism, mass surveillance and, 224–25 fear: government surveillance and, 4, 7, 95–97, 135, 156–57, 182–83, 222, 226, 227–30 media and, 229 politicians and, 222, 228 privacy trumped by, 228 social norms and, 227–30 Federal Bureau of Investigation (FBI): CALEA and, 83, 120 COINTELPRO program of, 103 cost to business of surveillance by, 121–22 counterterrorism as mission of, 184, 186 data mining by, 42 GPS tracking by, 26, 95 historical data stored by, 36 illegal spying by, 175 IMSI-catchers used by, 165 legitimate surveillance by, 184 Muslim Americans surveilled by, 103 PATRIOT Act and, 173–74 phone company databases demanded by, 27, 67 surveillance of all communications as goal of, 83 warrantless surveillance by, 67–68, 209 wiretapping by, 24, 27, 83, 171 Federal Communications Commission (FCC), 198 Federal Trade Commission, US (FTC), 46–47, 53, 117, 198 Feinstein, Diane, 172 Ferguson, Mo., 160 fiduciary responsibility, data collection and, 204–5 50 Cent Party, 114 FileVault, 215 filter bubble, 114–15 FinFisher, 81 First Unitarian Church of Los Angeles, 91 FISA (Foreign Intelligence Surveillance Act; 1978), 273 FISA Amendments Act (2008), 171, 273, 275–76 Section 702 of, 65–66, 173, 174–75, 261 FISA Court, 122, 171 NSA misrepresentations to, 172, 337 secret warrants of, 174, 175–76, 177 transparency needed in, 177 fishing expeditions, 92, 93 Fitbit, 16, 112 Five Eyes, 76 Flame, 72 FlashBlock, 49 flash cookies, 49 Ford Motor Company, GPS data collected by, 29 Foreign Intelligence Surveillance Act (FISA; 1978), 273 see also FISA Amendments Act Forrester Research, 122 Fortinet, 82 Fox-IT, 72 France, government surveillance in, 79 France Télécom, 79 free association, government surveillance and, 2, 39, 96 freedom, see liberty Freeh, Louis, 314 free services: overvaluing of, 50 surveillance exchanged for, 4, 49–51, 58–59, 60–61, 226, 235 free speech: as constitutional right, 189, 344 government surveillance and, 6, 94–95, 96, 97–99 Internet and, 189 frequent flyer miles, 219 Froomkin, Michael, 198 FTC, see Federal Trade Commission, US fusion centers, 69, 104 gag orders, 100, 122 Gamma Group, 81 Gandy, Oscar, 111 Gates, Bill, 128 gay rights, 97 GCHQ, see Government Communications Headquarters Geer, Dan, 205 genetic data, 36 geofencing, 39–40 geopolitical conflicts, and need for surveillance, 219–20 Georgia, Republic of, cyberattacks on, 75 Germany: Internet control and, 188 NSA surveillance of, 76, 77, 122–23, 151, 160–61, 183, 184 surveillance of citizens by, 350 US relations with, 151, 234 Ghafoor, Asim, 103 GhostNet, 72 Gill, 
Faisal, 103 Gmail, 31, 38, 50, 58, 219 context-sensitive advertising in, 129–30, 142–43 encryption of, 215, 216 government surveillance of, 62, 83, 148 GoldenShores Technologies, 46–47 Goldsmith, Jack, 165, 228 Google, 15, 27, 44, 48, 54, 221, 235, 272 customer loyalty to, 58 data mining by, 38 data storage capacity of, 18 government demands for data from, 208 impermissible search ad policy of, 55 increased encryption by, 208 as information middleman, 57 linked data sets of, 50 NSA hacking of, 85, 208 PageRank algorithm of, 196 paid search results on, 113–14 search data collected by, 22–23, 31, 123, 202 transparency reports of, 207 see also Gmail Google Analytics, 31, 48, 233 Google Calendar, 58 Google Docs, 58 Google Glass, 16, 27, 41 Google Plus, 50 real name policy of, 49 surveillance by, 48 Google stalking, 230 Gore, Al, 53 government: checks and balances in, 100, 175 surveillance by, see mass surveillance, government Government Accountability Office, 30 Government Communications Headquarters (GCHQ): cyberattacks by, 149 encryption programs and, 85 location data used by, 3 mass surveillance by, 69, 79, 175, 182, 234 government databases, hacking of, 73, 117, 313 GPS: automobile companies’ use of, 29–30 FBI use of, 26, 95 police use of, 26 in smart phones, 3, 14 Grayson, Alan, 172 Great Firewall (Golden Shield), 94, 95, 150–51, 187, 237 Greece, wiretapping of government cell phones in, 148 greenhouse gas emissions, 17 Greenwald, Glenn, 20 Grindr, 259 Guardian, Snowden documents published by, 20, 67, 149 habeas corpus, 229 hackers, hacking, 42–43, 71–74, 216, 313 of government databases, 73, 117, 313 by NSA, 85 privately-made technology for, 73, 81 see also cyberwarfare Hacking Team, 73, 81, 149–50 HAPPYFOOT, 3 Harris Corporation, 68 Harris Poll, 96 Hayden, Michael, 23, 147, 162 health: effect of constant surveillance on, 127 mass surveillance and, 16, 41–42 healthcare data, privacy of, 193 HelloSpy, 3, 245 Hewlett-Packard, 112 Hill, Raquel, 44 hindsight bias, 322 Hobbes, Thomas, 210 Home Depot, 110, 116 homosexuality, 97 Hoover, J.


pages: 606 words: 157,120

To Save Everything, Click Here: The Folly of Technological Solutionism by Evgeny Morozov

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, algorithmic trading, Amazon Mechanical Turk, Andrew Keen, augmented reality, Automated Insights, Berlin Wall, big data - Walmart - Pop Tarts, Buckminster Fuller, call centre, carbon footprint, Cass Sunstein, choice architecture, citizen journalism, cloud computing, cognitive bias, crowdsourcing, data acquisition, Dava Sobel, disintermediation, East Village, en.wikipedia.org, Fall of the Berlin Wall, Filter Bubble, Firefox, Francis Fukuyama: the end of history, frictionless, future of journalism, game design, Gary Taubes, Google Glasses, illegal immigration, income inequality, invention of the printing press, Jane Jacobs, Jean Tirole, Jeff Bezos, jimmy wales, Julian Assange, Kevin Kelly, Kickstarter, license plate recognition, lone genius, Louis Pasteur, Mark Zuckerberg, market fundamentalism, Marshall McLuhan, Narrative Science, Nicholas Carr, packet switching, PageRank, Paul Graham, Peter Singer: altruism, Peter Thiel, pets.com, placebo effect, pre–internet, Ray Kurzweil, recommendation engine, Richard Thaler, Ronald Coase, Rosa Parks, self-driving car, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, Slavoj Žižek, smart meter, social graph, social web, stakhanovite, Steve Jobs, Steven Levy, Stuxnet, technoutopianism, the built environment, The Chicago School, The Death and Life of Great American Cities, the medium is the message, The Nature of the Firm, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Thomas L Friedman, transaction costs, urban decay, urban planning, urban sprawl, Vannevar Bush, WikiLeaks

In an ideal world, of course, we should do all of the above, but in the real world, our resources are constrained, and we need to make choices and trade-offs. For Johnson, it seems that the project of pursuing media reform through collective action happily coexists with the project of seeking better understanding of our consumption practices via self-tracking; those two seem to run on two separate tracks without much overlap. “Should corporations building personalization algorithms include mutations to break a reader’s filter bubble? . . . Absolutely. But readers should also accept responsibility for their actions and make efforts to consume a responsible, nonhomogenous [sic] diet, too,” argues Johnson. Perhaps this pervasive emphasis on personal responsibility and individual salvation is the outcome of the Protestant streak in geek mentality documented by Chris Kelty. The problem with “information diet” rhetoric is that it recasts the citizen as a passive consumer who cannot be expected to dabble in complex matters of media reform and government policy.

Such tinkering was hard to pull off with newspapers and television, because they were targeting a mass audience; with Google’s glasses and the latest e-readers, with their highly individualized approach, it finally becomes possible. By relying on nudges and other similar tricks, it might suddenly become possible to get people to pay attention to Africa or North Korea. At first, such proposals flourished in the context of increasing “serendipity”—which is believed to be under perpetual assault by digital technologies. Thus, Eli Pariser in his Filter Bubble writes that “engineers . . . can solve for serendipity by designing filtering systems . . . to expose people to topics outside their normal experience.” How exactly would it work? Pariser wants Internet companies to actively serve content that they know you haven’t consumed—but think you should. “If Amazon harbors a hunch that you’re a crime novel reader,” he writes, “it could actively present you with choices from other genres.”
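
As a rough illustration of what “solving for serendipity” could look like in practice, here is a minimal sketch under stated assumptions: a hypothetical recommender reserves a couple of slots for genres the user has never consumed, rather than filling every slot from their history. The function, catalog, and genre names are invented for illustration and do not describe any real system.

```python
import random

def recommend(user_history, catalog_by_genre, n_slots=10, serendipity_slots=2):
    """Fill most slots from genres the user already reads, but reserve a few
    slots for genres they have never consumed."""
    familiar = set(user_history)
    unfamiliar = [g for g in catalog_by_genre if g not in familiar]

    picks = []
    for genre in user_history:                       # "more of the same" baseline
        picks.extend(catalog_by_genre.get(genre, []))
    picks = picks[: n_slots - serendipity_slots]

    for genre in random.sample(unfamiliar, min(serendipity_slots, len(unfamiliar))):
        picks.append(catalog_by_genre[genre][0])     # one title from outside the bubble
    return picks

catalog = {
    "crime":   ["The Big Sleep", "Gone Girl"],
    "science": ["The Selfish Gene"],
    "poetry":  ["Ariel"],
}
print(recommend(["crime"], catalog))
```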

., 7. 163 already employ algorithms to produce stories automatically: for more on this, see my Slate column: Evgeny Morozov, “A Robot Stole My Pulitzer,” Slate, March 19, 2012, http://www.slate.com/articles/technology/future_tense/2012/03/narrative_science_robot_journalists_customized_news_and_the_danger_to_civil_discourse_.html. 163 “I often wonder how many people”: Katy Waldman, “Popping the Myth of the Filter Bubble,” Slate, April 13, 2012, http://www.slate.com/articles/news_and_politics/intelligence_squared/2012/04/the_next_slate_intelligence_squared_debate_is_april_17_why_jacob_weisberg_rejects_the_idea_that_the_internet_is_closing_our_minds_in_politics_.single.html . 163 “beneficial inefficiency” that accompanied: David Karpf, The MoveOn Effect: The Unexpected Transformation of American Political Advocacy, 1st ed.


pages: 202 words: 59,883

Age of Context: Mobile, Sensors, Data and the Future of Privacy by Robert Scoble, Shel Israel

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, Apple II, augmented reality, call centre, Chelsea Manning, cloud computing, connected car, Edward Snowden, Elon Musk, factory automation, Filter Bubble, Google Earth, Google Glasses, Internet of things, job automation, Kickstarter, Mars Rover, Menlo Park, New Urbanism, PageRank, pattern recognition, RFID, ride hailing / ride sharing, Saturday Night Live, self-driving car, sensor fusion, Silicon Valley, Skype, smart grid, social graph, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Tesla Model S, Tim Cook: Apple, urban planning, Zipcar

They don’t yet do it as effectively as we humans do it, but they do it faster and far more efficiently. There is a dark side to these growing capabilities. We should watch for the unintended consequences that always seem to accompany significant change. The potential for data abuse and the loss of privacy head the list of concerns. Eli Pariser wrote a passionate and sincere argument about the loss of privacy in his 2011 book, The Filter Bubble. Pariser took a dark view of the fact that virtually every online site collects, shares and sells user data. He talked about how large organizations use data to stereotype people and then assume they know what we want to see and hear. By getting our eyeballs to stick to their web pages they then get us to click on ads they target at us. The book gave the impression that through data, big organizations are watching us in a very Orwellian way.

Pariser scared the hell out of a lot of people who were already unsettled about this topic. He served as the prosecutor making the case against big data, and he made a good case. In fact, there is truth to what he had to say and people should consider Pariser’s perspective as they make their own decisions about what to do and not do in the Age of Context. In our view, though, Pariser presented a one-sided perspective on a multi-sided and highly granular issue. The Filter Bubble overlooked the world-improving changes that big data is making. As Neo’s Eifrem sees it, “Fundamentally, companies like Neo build hammers. You can use them to build or to smash. Yes, there will be abuses and we must be vigilant about that, but the best solution to empowering people to find and learn what they need is contained in the new databases. Big data allows everyone to easily get better results for what they are looking for through personalization of search results.”


pages: 677 words: 206,548

Future Crimes: Everything Is Connected, Everyone Is Vulnerable and What We Can Do About It by Marc Goodman

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

23andMe, 3D printing, additive manufacturing, Affordable Care Act / Obamacare, Airbnb, airport security, Albert Einstein, algorithmic trading, artificial general intelligence, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Joy: nanobots, bitcoin, Black Swan, blockchain, borderless world, Brian Krebs, business process, butterfly effect, call centre, Chelsea Manning, cloud computing, cognitive dissonance, computer vision, connected car, corporate governance, crowdsourcing, cryptocurrency, data acquisition, data is the new oil, Dean Kamen, disintermediation, don't be evil, double helix, Downton Abbey, Edward Snowden, Elon Musk, Erik Brynjolfsson, Filter Bubble, Firefox, Flash crash, future of work, game design, Google Chrome, Google Earth, Google Glasses, Gordon Gekko, high net worth, High speed trading, hive mind, Howard Rheingold, hypertext link, illegal immigration, impulse control, industrial robot, Internet of things, Jaron Lanier, Jeff Bezos, job automation, John Harrison: Longitude, Jony Ive, Julian Assange, Kevin Kelly, Khan Academy, Kickstarter, knowledge worker, Kuwabatake Sanjuro: assassination market, Law of Accelerating Returns, Lean Startup, license plate recognition, litecoin, M-Pesa, Mark Zuckerberg, Marshall McLuhan, Menlo Park, mobile money, more computing power than Apollo, move fast and break things, Nate Silver, national security letter, natural language processing, obamacare, Occupy movement, Oculus Rift, offshore financial centre, optical character recognition, pattern recognition, personalized medicine, Peter H. Diamandis: Planetary Resources, Peter Thiel, pre–internet, RAND corporation, ransomware, Ray Kurzweil, refrigerator car, RFID, ride hailing / ride sharing, Rodney Brooks, Satoshi Nakamoto, Second Machine Age, security theater, self-driving car, shareholder value, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart grid, smart meter, Snapchat, social graph, software as a service, speech recognition, stealth mode startup, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, supply-chain management, technological singularity, telepresence, telepresence robot, Tesla Model S, The Wisdom of Crowds, Tim Cook: Apple, trade route, uranium enrichment, Wall-E, Watson beat the top human players on Jeopardy!, Wave and Pay, We are Anonymous. We are Legion, web application, WikiLeaks, Y Combinator, zero day

Simply stated, Facebook, Google, and other Internet companies know that if they provide you the “right” stuff, you’ll spend more time on their sites and click on more links, allowing them to serve you up more ads. Facebook is by no means alone in this game, and Google too quantifies all your prior searches and, more important, what you’ve clicked on, in order to customize your online experience. In his book The Filter Bubble, the technology researcher Eli Pariser carefully documented the phenomenon. Getting you the “right” results is big business, and millions of computer algorithms are dedicated to the task. Google reportedly has at least fifty-seven separate personalization signals it tracks and considers before answering your questions, potentially including the type of computer you are on, the browser you are using, the time of day, the resolution of your computer monitor, messages received in Gmail, videos watched on YouTube, and your physical location.
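
To make the idea of “personalization signals” concrete, here is a small, purely hypothetical sketch: a result’s base relevance is boosted by whichever user-specific signals happen to be present, so two people issuing the same query see different rankings. The signal names and weights are invented; this is not Google’s actual algorithm.

```python
def personalized_score(base_relevance, signals, weights=None):
    """Boost a query-relevance score using whichever per-user signals are present."""
    weights = weights or {
        "clicked_domain_before":     0.30,  # user clicked this site in past searches
        "matches_location":          0.15,  # result is local to the user
        "mobile_friendly_on_mobile": 0.10,
        "active_hours_match":        0.05,  # fits the user's usual time of day
    }
    boost = sum(w for name, w in weights.items() if signals.get(name))
    return base_relevance * (1.0 + boost)

# The same page, the same query, two different users:
print(personalized_score(0.8, {"clicked_domain_before": True, "matches_location": True}))   # ≈ 1.16
print(personalized_score(0.8, {"clicked_domain_before": False, "matches_location": False})) # 0.8
```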

The problem with this invisible “black box” algorithmic approach to information is that we do not know what has been edited out for us and what we are not seeing. As a result, our digital lives, mediated through a sea of screens, are being actively manipulated and filtered on a daily basis in ways that are both opaque and indecipherable. This fundamental shift in the way information flows online shapes not only the way we are informed but the way we view the world. Most of us are living in filter bubbles, and we don’t even realize it. Around the world, nations are increasingly deciding what data citizens should be able to access and what information should be prohibited. Using compelling arguments such as “protecting national security,” “ensuring intellectual property rights,” “preserving religious values,” and the perennial favorite, “saving the children,” governments are ever expanding their national firewalls for the purpose of Internet censorship.

With each successive generation, we grow deeply comfortable, even if only unconsciously, with blindly following the directions provided to us by machines. Garbage in, garbage out has been supplanted by garbage in, gospel out: if the computer says so, it must be true. The problem with such reasoning is that we as a society are relying on incorrect data all the time, a festering problem that will come back to bite us. Filter bubbles, invisible search engine censorship, national firewalls, and faulty data mean we have a fundamental integrity problem with the way we see the world, or more precisely with the way the world is presented to us, mediated through our screens. When Seeing Ain’t Believing In the preceding chapters, we focused extensively on what happens when your data leak and your information confidentiality is breached.


pages: 88 words: 22,980

One Way Forward: The Outsider's Guide to Fixing the Republic by Lawrence Lessig

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

collapse of Lehman Brothers, crony capitalism, crowdsourcing, en.wikipedia.org, Filter Bubble, jimmy wales, Occupy movement, Ronald Reagan

And nothing could have made MSNBC or Comedy Central happier, since, having lost George Bush as a target, they could now shift to a network—Fox News—as a target. And so, too, with us. We all have joined our i-enabled organization of choice: the Tea Party or MoveOn, Drudge or Huffington Post. We all get our daily fix of fury, from e-mail lists or podcasts, from news sites or blogs. We tune in to the message we want. We tune out the message we can’t stomach. Indeed, as Eli Pariser so powerfully demonstrates in his 2011 book The Filter Bubble, the machines themselves help us tune out. There’s no such thing as “a Google search”; there’s only “my searches on Google.” Google remembers the sort of stuff I’m interested in. Those interests help determine the search results that Google gives me. And thus are my search results different from yours: once again, the business model of polarization, made perfect by the amazing Google. But so what?


pages: 94 words: 26,453

The End of Nice: How to Be Human in a World Run by Robots (Kindle Single) by Richard Newton

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, Black Swan, British Empire, Buckminster Fuller, Clayton Christensen, crowdsourcing, deliberate practice, fear of failure, Filter Bubble, future of work, Google Glasses, Isaac Newton, James Dyson, Jaron Lanier, Jeff Bezos, job automation, Lean Startup, low skilled workers, Mark Zuckerberg, move fast and break things, Paul Erdős, Paul Graham, recommendation engine, rising living standards, Robert Shiller, Robert Shiller, Silicon Valley, Silicon Valley startup, skunkworks, Steve Ballmer, Steve Jobs, Y Combinator

But search results are also tailored to you. And that’s more of a concern. The search results you get will be different to the results for an identical search made by me. In fact, so much insight can be derived from your online behaviour that Google and other organisations can ensure you get news that makes you happy… or even angry the way you like to be angry. It’s a process described by Eli Pariser in his book The Filter Bubble: “When technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens.” Pariser wrote this following Google’s decision, in December 2009, to begin customising its search results for each user. Instead of giving you the most broadly popular result, Google would try to predict what you are most likely to click on. In an algorithmically customised world, we will increasingly be categorised, sliced and diced and prompted and fed only news that is pleasant, familiar, and confirms our beliefs.


pages: 320 words: 87,853

The Black Box Society: The Secret Algorithms That Control Money and Information by Frank Pasquale

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Affordable Care Act / Obamacare, algorithmic trading, Amazon Mechanical Turk, asset-backed security, Atul Gawande, bank run, barriers to entry, Berlin Wall, Bernie Madoff, Black Swan, bonus culture, Brian Krebs, call centre, Capital in the Twenty-First Century by Thomas Piketty, Chelsea Manning, cloud computing, collateralized debt obligation, corporate governance, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, cryptocurrency, Debian, don't be evil, Edward Snowden, en.wikipedia.org, Fall of the Berlin Wall, Filter Bubble, financial innovation, Flash crash, full employment, Goldman Sachs: Vampire Squid, Google Earth, Hernando de Soto, High speed trading, hiring and firing, housing crisis, informal economy, information retrieval, interest rate swap, Internet of things, invisible hand, Jaron Lanier, Jeff Bezos, job automation, Julian Assange, Kevin Kelly, knowledge worker, Kodak vs Instagram, kremlinology, late fees, London Interbank Offered Rate, London Whale, Mark Zuckerberg, mobile money, moral hazard, new economy, Nicholas Carr, offshore financial centre, PageRank, pattern recognition, precariat, profit maximization, profit motive, quantitative easing, race to the bottom, recommendation engine, regulatory arbitrage, risk-adjusted returns, search engine result page, shareholder value, Silicon Valley, Snapchat, Spread Networks laid a new fibre optics cable between New York and Chicago, statistical arbitrage, statistical model, Steven Levy, the scientific method, too big to fail, transaction costs, two-sided market, universal basic income, Upton Sinclair, value at risk, WikiLeaks

Multiply that experience by years of people, e-mail, and search—that’s how powerful the dominant platforms really are as artificial intelligence aids for virtually any tasks we undertake.120 They have unmatched abilities to advance users’ data-dependent interests. But personalization has unnerving effects, too. Google results have become so very particular that it is increasingly difficult to assess how much of any given subject or controversy any of us actually sees. We see what we have trained Google to show us and what Google gradually conditions us to expect. Entrepreneur Eli Pariser calls this phenomenon “the filter bubble” and worries that all this personalization has serious side effects, namely increased insularity and reinforced prejudice.121 So intense is the personalization of search results, for instance, that when British Petroleum’s (BP) massive oil spill was dominating cable news in the summer of 2010, searches for “BP” on Google led some users to fierce denunciations of the company’s environmental track record, and others to  investment opportunities in the company.122 Only the search engineers at the Googleplex can reliably track who’s seeing what and why.

Pam Dixon and Robert Gellman, The Scoring of America (San Diego: World Privacy Forum, 2014). 47. Frank A. Pasquale and Tara Adams Ragone, “The Future of HIPAA in the Cloud,” Stanford Technology Law Review (forthcoming 2014). Available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2298158. 48. A company called Acxiom has 1,600 pieces of information about 98 percent of U.S. adults, gathered from thousands of sources. Eli Pariser, The Filter Bubble (New York: Penguin, 2011), 3. At least some of them are health-indicative or health-predictive. Daniel J. Solove, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (New Haven, CT: Yale University Press, 2008); Natasha Singer, “You for Sale: Mapping the Consumer Genome,” New York Times, June 16, 2012; Nicolas P. Terry, “Protecting Patient Privacy in the Age of Big Data,” UMKC Law Review 81 (2012): 385–416. 49.

., Digital Labor: The Internet as Playground and Factory (New York: Routledge, 2013); Jaron Lanier, Who Owns the Future? (New York: Simon & Schuster, 2013); Jessica Weisberg, “Should Facebook Pay Its Users?,” The Nation, January 14, 2014 (quoting manifesto “WE WANT TO CALL WORK WHAT IS WORK SO THAT EVENTUALLY WE MIGHT REDISCOVER WHAT FRIENDSHIP IS”). 121. Eli Pariser, The Filter Bubble (New York: Penguin, 2011). 122. Ibid., 6–7. 123. Fortunately, one has written a work of fiction to suggest what could go wrong. Shumeet Baluja, The Silicon Jungle: A Novel of Deception, Power, and Internet Intrigue (Princeton, NJ: Princeton University Press, 2011). 124. Cathy O’Neil, “When Accurate Modeling Is Not Good,” Mathbabe (blog), December 12, 2012, http://mathbabe.org/2012/12/12/when-accurate-modeling-is-not-good/ (analyzing the work of a casino CEO concerned with predictive analytics). 125.


pages: 510 words: 120,048

Who Owns the Future? by Jaron Lanier

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, 4chan, Affordable Care Act / Obamacare, Airbnb, augmented reality, automated trading system, barriers to entry, bitcoin, book scanning, Burning Man, call centre, carbon footprint, cloud computing, computer age, crowdsourcing, David Brooks, David Graeber, delayed gratification, digital Maoism, en.wikipedia.org, facts on the ground, Filter Bubble, financial deregulation, Fractional reserve banking, Francis Fukuyama: the end of history, George Akerlof, global supply chain, global village, Haight Ashbury, hive mind, if you build it, they will come, income inequality, informal economy, invisible hand, Jacquard loom, Jaron Lanier, Jeff Bezos, job automation, Kevin Kelly, Khan Academy, Kickstarter, Kodak vs Instagram, life extension, Long Term Capital Management, Mark Zuckerberg, meta analysis, meta-analysis, moral hazard, mutually assured destruction, Network effects, new economy, Norbert Wiener, obamacare, packet switching, Peter Thiel, place-making, Plutocrats, plutocrats, Ponzi scheme, post-oil, pre–internet, race to the bottom, Ray Kurzweil, rent-seeking, reversible computing, Richard Feynman, Richard Feynman, Ronald Reagan, self-driving car, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, smart meter, stem cell, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, The Market for Lemons, Thomas Malthus, too big to fail, trickle-down economics, Turing test, Vannevar Bush, WikiLeaks

These days, we wait for unpaid partisan crowds to pore through a controversial speech to document misleading mash-ups. Bloggers will notice when a candidate is quoted out of context in a campaign commercial. Similarly, journalists will eventually notice when inflammatory anti-Islamic videos have been faked and dubbed. That is not an entirely dysfunctional means of making up for lost context, but it does mean that corrections and context are trapped within online “filter bubbles.” It is not a given that those who might be predisposed to believe in a deceptive mash-up’s point of view will be exposed to a factual correction about what was mashed. Of course, there’s no guarantee that a person who wants to believe in an idea would actually follow the link to see if a mash-up was deceptive, but at least the link would be right there in front of them. If you doubt the importance of that small change, just look at Google’s revenues, which are almost entirely based on putting links immediately in front of people.

• There will be much more information available in some semblance of book form than ever before, but overall a lower quality standard.
• A book won’t necessarily be the same for each person who reads it or if the same person reads it twice. On the one hand this will mean better updates for some kinds of information and fewer encounters with typos, but on the other will deemphasize the rhythm and poetics of prose, minimize the stakes of declaring a manuscript complete, and expand the “filter bubble” effect.
• The means to find reading material will be where business battles are fought. The fights often will not be pretty. The interface between readers and books will be contested and often corrupted by spam and deception.
• Writing a book won’t mean as much. Some will think of this as a democratic, antielitist benefit, and others will think of it as a lowering of standards.
• Readers will spend a lot of time hassling with forgotten passwords, expired credit cards, and being locked into the wrong device or mobile-service contract for years at a time.

., 49, 74–75 entertainment industry, 7, 66, 109, 120, 135, 136, 185–86, 258, 260 see also mass media entrepreneurship, 14, 57, 79, 82, 100–106, 116, 117–20, 122, 128, 148–49, 166, 167, 183, 200, 234, 241–43, 248, 274, 326, 359 entropy, 55–56, 143, 183–84 environmental issues, 32 equilibrium, 148–51 Erlich, Paul, 132 est, 214 Ethernet, 229 Etsy, 343 Europe, 45, 54, 77, 199 evolution, 131, 137–38, 144, 146–47 exclusion principle, 181, 202 Expedia, 65 experiments, scientific, 112 experts, 88, 94–95, 124, 133–34, 178, 325–31, 341, 342 externalization, 59n Facebook, 2, 8, 14, 20, 56–57, 93, 109, 154, 169, 171, 174, 180, 181, 188, 190–91, 200n, 204, 206, 207, 209, 210, 214, 215, 217, 227, 242–43, 246, 248, 249, 251, 270, 280, 286, 306, 309, 310, 313, 314, 317, 318, 322, 326, 329, 341, 343, 344, 346, 347–48, 366 facial recognition, 305n, 309–10 factories, 43, 85–86, 88, 135 famine, 17, 132 Fannie Mae, 69 fascism, 159–60 fashion, 89, 260 feedback, 112, 162, 169, 203, 298, 301–3, 363–64, 365 fees, service, 81, 82 feudalism, 79 Feynman, Richard, 94 file sharing, 50–52, 61, 74, 78, 88, 100, 223–30, 239–40, 253–64, 277, 317–24, 335, 349 “filter bubbles,” 225, 357 filters, 119–20, 200, 225, 356–57 financial crisis (2008), 76–77, 115, 148n financial services, 7n, 29–31, 35, 38, 45, 49, 50, 52, 54, 56–67, 69–70, 74–80, 82, 115, 116–20, 148n, 153–54, 155, 179–85, 200, 208, 218, 254, 257, 258, 277–78, 298, 299–300, 301, 336–37, 344–45, 348, 350 firewalls, 305 first-class economic citizens, 246, 247, 248–51, 273, 286–87, 323, 349, 355–56 Flightfox, 64 fluctuations, 76–78 flu outbreaks, 110, 120 fMRI, 111–12 food supplies, 17, 123, 131 “Fool on the Hill, The,” 213 Ford, Henry, 43 Ford, Martin, 56n Forster, E.

Common Knowledge?: An Ethnography of Wikipedia by Dariusz Jemielniak

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Andrew Keen, barriers to entry, citation needed, collaborative consumption, collaborative editing, conceptual framework, continuous integration, crowdsourcing, Debian, deskilling, digital Maoism, en.wikipedia.org, Filter Bubble, Google Glasses, Hacker Ethic, hive mind, Internet Archive, invisible hand, Jaron Lanier, jimmy wales, job satisfaction, Julian Assange, knowledge economy, knowledge worker, Menlo Park, moral hazard, online collectivism, pirate software, RFC: Request For Comment, Richard Stallman, Silicon Valley, Skype, slashdot, social software, Stewart Brand, The Nature of the Firm, The Wisdom of Crowds, transaction costs, WikiLeaks, wikimedia commons

Yet the thesis that the open-collaboration phenomenon leads univocally and definitely to liberating consumers from traditional neoliberal institutions and economics seems risky. Moreover, the theoretical democratization of knowledge production may be simply a reenactment of the established system (König, 2012), as discussed in Chapter 2. As Eli E. Pariser’s recent work convincingly shows, the free access to information may just as well be threatened by “filter bubbles” (2011) and corporate monopolization of knowledge, not only supporting the old establishment but also adding new layers to it. Wikipedia seems to be, willingly or not, in the middle of a major ideological clash: Today powerful and highly profitable corporations such as Microsoft and Google are battling for a greater presence and power on the internet. However, the orientation of capitalism and its goals—especially ever-increasing profits—are in conflict with the cyber-libertarianism that remains a strong presence online on sites such as Wikipedia, Linux, and Creative Commons communities.

Getting the seats of your pants dirty: Strategies for ethnographic research on virtual communities. Journal of Computer-Mediated Communication, 3(1). doi:10.1111/j.1083-6101.1997.tb00065.x Panciera, K., Halfaker, A., & Terveen, L. (2009). Wikipedians are born, not made: A study of power editors on Wikipedia. Paper presented at the GROUP ’09 Proceedings of the ACM 2009 International conference on Supporting Group Work, New York. Pariser, E. E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press. Parvaz, D. (2011, January 15). Look it up: Wikipedia turns 10. AlJazeera. Retrieved from http://www.aljazeera.com/indepth/features/2011/01/201111571716655385.html Pegg, D., & Wright, D. (2011, December 8). Wikipedia founder attacks Bell Pottinger for “ethical blindness.” The Independent. Retrieved from http://www.independent.co.uk/news/uk/politics/wikipedia-founder-attacks-bell-pottinger-for-ethical-blindness-6273836.html Pentzold, C. (2011).

See also Gdańsk/Danzig edit war; RfAs (requests for adminship); Wales, Jimmy “Jimbo” entry barriers for new users, 101 EQ (rules of etiquette), 18 Errant (user), 44–45 Essjay controversy: and apology to Wales, 111; final results of, 117; legalistic solution to, 123; and nature of Essjay’s wrongdoing, 113–114; and resignation from Wikipedia positions, 113; and use of false “authority,” 109–112, 114 ethical breaching experiments, 164–167 ethnographic project, this book as, 193–194 etiquette, 18, 94 exclusionism, 23 Facebook, 95–96, 172 faceless, Wikipedia as, 183 face-to-face contact, online communities with and without, 77 facilitator role in conflict resolution and consensus, 82–83 false personae, 111, 113–114, 117–118 Faraj, Samer, 59–60, 85 FDC (Funds Dissemination Committee), 131–132, 197 featured-article designation, 24 Fetchcomms (user), 46 filibustering, 20 Filipacchi, Amanda, 16 “filter bubbles,” 189 “filter then publish” or vice versa, 183 five pillars, 96, 98 flagged revisions, 136 F/LOSS (free/libre and open-sourcesoftware) movement, 2; and copyright laws, 22; as created by software professionals, 107; forking in, 145; gender gap in, 16; leadership philosophy of, 174–176; patterns of partipation within, 39; as supporting Internet freedom and opposing censorship, 141; voluntary work and immaterial labor in, 230n7 FORG (forgive and forget) norm, 18 forking, 126, 133, 144–148, 179 formalization of rules, 120–121, 124, 151, 174 formal mediation, 61 fossilized procedures, 121 Foucault, Michel, 54 founder role in organizational development, 154–155, 174, 178–179 “founder’s seat” on board of trustees, 129 founding principles, 162 FoxNews.com, 167, 171 2 8 6    I n d e x freedom of information debates, 151 French Wikipedia, 12, 15, 77, 146 fund-raising and distribution, 130, 131–133 Funds Dissemination Committee (FDC), 131–132, 197 Future Search, 64 GAME (gaming the system) rule, 20 Ganga/Ganges issue, 76 Gardner, Sue: on controversial-content filtering, 145–146; and Quaker consensus model, 62; on Wales and Wikiversity issue, 165–166; on WMF accountability, 132; as WMF executive director, 129, 154–155 gatekeeping, 14, 17 Gdansk (user), 68, 73 Gdańsk/Danzig edit war, 64; active parties in, 67–68; beginning of, 65–67; called lame, 71; community attempts to resolve, 68–70; escalation of, 67–73; inability of consensus to end, 59, 70–71; mediation request in, 71; and peace without consensus, 74–76, 78; and stalemate, 80; votes during, 73, 74–75 “geeks,” 188 Geertz, Clifford, 195, 196 gender: attempts to address gender gap, 229n8; of editors, 14–16, 191, 229n8, 231–232n12; effects of gender bias, 16, 77; Homopedia, 5; and self-disclosure as optional online, 25, 117, 199 German Wikipedia, 11, 12, 15, 146, 234n8 Germany.


pages: 743 words: 201,651

Free Speech: Ten Principles for a Connected World by Timothy Garton Ash

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

A Declaration of the Independence of Cyberspace, Affordable Care Act / Obamacare, Andrew Keen, Apple II, Ayatollah Khomeini, battle of ideas, Berlin Wall, bitcoin, British Empire, Cass Sunstein, Chelsea Manning, citizen journalism, Clapham omnibus, colonial rule, crowdsourcing, David Attenborough, don't be evil, Edward Snowden, Etonian, European colonialism, eurozone crisis, failed state, Fall of the Berlin Wall, Ferguson, Missouri, Filter Bubble, financial independence, Firefox, Galaxy Zoo, global village, index card, Internet Archive, invention of movable type, invention of writing, Jaron Lanier, jimmy wales, Julian Assange, Mark Zuckerberg, Marshall McLuhan, megacity, mutually assured destruction, national security letter, Netflix Prize, Nicholas Carr, obamacare, Peace of Westphalia, Peter Thiel, pre–internet, profit motive, RAND corporation, Ray Kurzweil, Ronald Reagan, semantic web, Silicon Valley, Simon Singh, Snapchat, social graph, Stephen Hawking, Steve Jobs, Steve Wozniak, The Death and Life of Great American Cities, The Wisdom of Crowds, Turing test, We are Anonymous. We are Legion, WikiLeaks, World Values Survey, Yom Kippur War

If the three philosophers did not already have their heads in different clouds, they soon would have.145 The situation is constantly evolving but, at this writing, Google will customise your search results on the basis of your location and—if you are logged in rather than actively opting to search anonymously—your personal search history, as well as information drawn from those of your email accounts and social networks to which it has access. The last is the ‘social’ component of customisation, including what your friends are interested in. If you and I search for exactly the same term, we will get different results. And if we are not careful, we will each hive off into our own individual ‘filter bubble’.146 More broadly, across the internet there is a risk of fragmentation into thousands of tiny ‘information cocoons’: echo chambers where the news and opinions we see are only those favoured by the like-minded and our only newspaper is the Daily Me. At the extreme, you have the Norwegian mass murderer Anders Behring Breivik, whose anti-Muslim fury was reinforced by constantly revisiting a handful of hysterical sites about the threats Islam and multiculturalism posed to Europe and by his own tiny crowd of online correspondents.

A 15-year-old British schoolgirl who flew to join the Islamic State terrorist organisation in Syria had, it turned out, been following 74 radical and fundamentalist Islamist Twitter accounts.16 Unlike in the physical world, the internet makes it easy for the conspiracy theorist to find the 957 other people across the planet who share his or her particular poisoned fantasy. The increasingly personalised nature of internet searches on Google and other search engines can exacerbate the problem, with everyone disappearing into his or her own ‘filter bubble’.17 This is a much broader problem, to which we will return, but it clearly affects the analysis of speech and violence. Thus, for example, the Norwegian mass murderer Anders Behring Breivik was reinforced in his paranoid views by obsessive reading of anti-Islamic and anti-multiculturalist websites such as Pamela Geller’s ‘Atlas Shrugs’ and Robert Spencer’s ‘Gates of Vienna’, from which he quoted in his online ‘crusader’ manifesto.18 Does that mean such sites should be blocked and their content censored?

If you are logged in as a Google user, your results will be customised not just on the basis of your location and previous search history but also on information Google has collected from your Gmail accounts, your use of its social networks and any other online source to which it has access. The most alarming aspect of this is the threat to our privacy, which I discuss more in chapter 7, but it can also impair our pursuit of knowledge. If the effect of search personalisation is to give a higher ranking to sites we and our online contacts have previously viewed, then we are in danger of being hived off into ‘filter bubbles’ of the like-minded. Google will reply that it is just giving people what they want—a more personalised, customised service. But that is only half the story. The other half is that Google is giving advertisers what they want: the capacity to target individual consumers ever more precisely. If information is power, personalised information is also money. Google Search is a fantastic, transformative tool in the quest for knowledge.


pages: 361 words: 81,068

The Internet Is Not the Answer by Andrew Keen

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, A Declaration of the Independence of Cyberspace, Airbnb, AltaVista, Andrew Keen, augmented reality, Bay Area Rapid Transit, Berlin Wall, bitcoin, Black Swan, Burning Man, Cass Sunstein, citizen journalism, Clayton Christensen, clean water, cloud computing, collective bargaining, Colonization of Mars, computer age, connected car, cuban missile crisis, David Brooks, disintermediation, Downton Abbey, Edward Snowden, Elon Musk, Erik Brynjolfsson, Fall of the Berlin Wall, Filter Bubble, Francis Fukuyama: the end of history, Frank Gehry, Frederick Winslow Taylor, frictionless, full employment, future of work, gig economy, global village, Google bus, Google Glasses, Hacker Ethic, happiness index / gross national happiness, income inequality, index card, informal economy, information trail, Innovator's Dilemma, Internet of things, Isaac Newton, Jaron Lanier, Jeff Bezos, job automation, Joseph Schumpeter, Julian Assange, Kevin Kelly, Kickstarter, Kodak vs Instagram, Lean Startup, libertarian paternalism, Lyft, Mark Zuckerberg, Marshall McLuhan, Martin Wolf, move fast and break things, Nate Silver, Network effects, new economy, Nicholas Carr, nonsequential writing, Norbert Wiener, Occupy movement, packet switching, PageRank, Paul Graham, Peter Thiel, Plutocrats, plutocrats, Potemkin village, precariat, pre–internet, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Skype, smart cities, Snapchat, social web, South of Market, San Francisco, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, TaskRabbit, Ted Nelson, telemarketer, the medium is the message, Thomas L Friedman, Tyler Cowen: Great Stagnation, Uber for X, urban planning, Vannevar Bush, Whole Earth Catalog, WikiLeaks, winner-take-all economy, working poor, Y Combinator

But the personal revolution is certainly making us more parochial and unworldly. Just as Instagram enables us to take photos that are dishonest advertisements for ourselves, so search engines like Google provide us with links to sites tailored to confirm our own mostly ill-informed views about the world. Eli Pariser, the former president of MoveOn.org, describes the echo-chamber effect of personalized algorithms as “The Filter Bubble.”41 The Internet might be a village, Pariser says, but there’s nothing global about it. This is confirmed by a 2013 study by the Massachusetts Institute of Technology showing that the vast majority of Internet and cell phone communication takes place inside a hundred-mile radius of our homes and by a 2014 Pew Research and Rutgers University report revealing that social media actually stifles debate between people of different opinions.42 But the reality of the Web is probably even more selfie-centric than the MIT report suggests.

., epilogue, pp. 240–51. 38 Williams, “The Agony of Instagram.” 39 Rhiannon Lucy Coslett and Holly Baxter, “Smug Shots and Selfies: The Rise of Internet Self-Obsession,” Guardian, December 6, 2013. 40 Nicholas Carr, “Is Google Making Us Stupid?,” Atlantic, July/August 2008. Also see Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York; Norton, 2011). 41 Eli Pariser, The Filter Bubble: What the Internet Is Hiding From You (Penguin, 2011). See also my June 2011 TechCrunchTV interview with Eli Pariser: Andrew Keen, “Keen On . . . Eli Pariser: Have Progressives Lost Faith in the Internet?,” TechCrunch, June 15, 2011, techcrunch.com/2011/06/15/keen-on-eli-pariser-have-progressives-lost-faith-in-the-internet-tctv. 42 Claire Carter, “Global Village of Technology a Myth as Study Shows Most Online Communication Limited to 100-Mile Radius,” BBC, December 18, 2013; Claire Cain Miller, “How Social Media Silences Debate,” New York Times, August 26, 2014. 43 Josh Constine, “The Data Factory—How Your Free Labor Lets Tech Giants Grow the Wealth Gap.” 44 Derek Thompson, “Google’s CEO: ‘The Laws Are Written by Lobbyists,’” Atlantic, October 1, 2010. 45 James Surowiecki, “Gross Domestic Freebie,” New Yorker, November 25, 2013. 46 Monica Anderson, “At Newspapers, Photographers Feel the Brunt of Job Cuts,” Pew Research Center, November 11, 2013. 47 Robert Reich, “Robert Reich: WhatsApp Is Everything Wrong with the U.S.


pages: 422 words: 104,457

Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance by Julia Angwin

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

AltaVista, Ayatollah Khomeini, barriers to entry, bitcoin, Chelsea Manning, clean water, crowdsourcing, cuban missile crisis, data is the new oil, David Graeber, Debian, Edward Snowden, Filter Bubble, Firefox, GnuPG, Google Chrome, Google Glasses, informal economy, Jacob Appelbaum, Julian Assange, market bubble, market design, medical residency, meta analysis, meta-analysis, mutually assured destruction, prediction markets, price discrimination, randomized controlled trial, RFID, Robert Shiller, Ronald Reagan, security theater, Silicon Valley, Silicon Valley startup, Skype, smart meter, Steven Levy, Upton Sinclair, WikiLeaks, Y2K, Zimmermann PGP

CV 12-1365-PHX-PGR (2012), http://ftc.gov/os/caselist/1023142/120809wyndhamcmpt.pdf. It claimed the FTC was unfairly penalizing: Brent Kendall, “FTC Fires Back in Cybersecurity Case,” Wall Street Journal, Law Blog, May 24, 2013, http://blogs.wsj.com/law/2013/05/24/ftc-fires-back-in-cybersecurity-case/. I call this type of mass customization: Internet activist Eli Pariser calls this phenomenon the “filter bubble.” Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011). Consider this: searching for a traditionally black-sounding name: Latanya Sweeney, “Discrimination in Online Ad Delivery” (Working Paper Series, Harvard University, Cambridge, Massachusetts, January 28, 2013), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2208240. In the months leading up to the November 2012 presidential election: Julia Angwin, “On Google, a Political Mystery That’s All Numbers,” Wall Street Journal, November 4, 2012, http://online.wsj.com/article/SB10001424052970203347104578099122530080836.html.


pages: 525 words: 116,295

The New Digital Age: Transforming Nations, Businesses, and Our Lives by Eric Schmidt, Jared Cohen

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, access to a mobile phone, additive manufacturing, airport security, Amazon Mechanical Turk, Amazon Web Services, anti-communist, augmented reality, Ayatollah Khomeini, barriers to entry, bitcoin, borderless world, call centre, Chelsea Manning, citizen journalism, clean water, cloud computing, crowdsourcing, data acquisition, Dean Kamen, Elon Musk, failed state, fear of failure, Filter Bubble, Google Earth, Google Glasses, hive mind, income inequality, information trail, invention of the printing press, job automation, Julian Assange, Khan Academy, Kickstarter, knowledge economy, Law of Accelerating Returns, market fundamentalism, means of production, mobile money, mutually assured destruction, Naomi Klein, offshore financial centre, peer-to-peer lending, personalized medicine, Peter Singer: altruism, Ray Kurzweil, RFID, self-driving car, sentiment analysis, Silicon Valley, Skype, Snapchat, social graph, speech recognition, Steve Jobs, Steven Pinker, Stewart Brand, Stuxnet, The Wisdom of Crowds, upwardly mobile, Whole Earth Catalog, WikiLeaks, young professional, zero day

: A New Yorker cartoon by Tom Cheney in 2012 expressed a similar idea. Its caption read “The Cloud Ate My Homework.” See “Cartoons from the Issue,” New Yorker, October 8, 2012, http://www.newyorker.com/humor/issuecartoons/2012/10/08/cartoons_20121001#slide=5. CHAPTER 2 THE FUTURE OF IDENTITY, CITIZENSHIP AND REPORTING While many worry about the phenomenon of confirmation bias: Eli Pariser describes this as a “filter bubble” in his book The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Press, 2011). a recent Ohio State University study: R. Kelly Garrett and Paul Resnick, “Resisting Political Fragmentation on the Internet,” Daedalus 140, no. 4 (Fall 2011): 108–120, doi:10.1162/DAED_a_00118. famously dissected how ethnically popular names: Steven D. Levitt and Stephen J. Dubner, Freakonomics: A Rogue Economist Explores the Hidden Side of Everything (New York: William Morrow, 2005); their study showed that the names were not the cause of a child’s success or failure, but a symptom of other indicators (particularly socioeconomic ones) that do influence a child’s chances.


pages: 396 words: 117,149

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World by Pedro Domingos

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, Albert Einstein, Amazon Mechanical Turk, Arthur Eddington, Benoit Mandelbrot, bioinformatics, Black Swan, Brownian motion, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer vision, constrained optimization, correlation does not imply causation, crowdsourcing, Danny Hillis, data is the new oil, double helix, Douglas Hofstadter, Erik Brynjolfsson, experimental subject, Filter Bubble, future of work, global village, Google Glasses, Gödel, Escher, Bach, information retrieval, job automation, John Snow's cholera map, John von Neumann, Joseph Schumpeter, Kevin Kelly, lone genius, mandelbrot fractal, Mark Zuckerberg, Moneyball by Michael Lewis explains big data, Narrative Science, Nate Silver, natural language processing, Netflix Prize, Network effects, NP-complete, P = NP, PageRank, pattern recognition, phenotype, planetary scale, pre–internet, random walk, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman, Second Machine Age, self-driving car, Silicon Valley, speech recognition, statistical model, Stephen Hawking, Steven Levy, Steven Pinker, superintelligent machines, the scientific method, The Signal and the Noise by Nate Silver, theory of mind, transaction costs, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, white flight

Your bot’s job is to see through their claims, just as you see through TV commercials, but at a much finer level of detail, one that you’d never have the time or patience for. Before you buy a car, the digital you will go over every one of its specs, discuss them with the manufacturer, and study everything anyone in the world has said about that car and its alternatives. Your digital half will be like power steering for your life: it goes where you want to go but with less effort from you. This does not mean that you’ll end up in a “filter bubble,” seeing only what you reliably like, with no room for the unexpected; the digital you knows better than that. Part of its brief is to leave some things open to chance, to expose you to new experiences, and to look for serendipity. Even more interesting, the process doesn’t end when you find a car, a house, a doctor, a date, or a job. Your digital half is continually learning from its experiences, just as you would.

See also Cancer drugs Duhigg, Charles, 223 Dynamic programming, 220 Eastwood, Clint, 65 Echolocation, 26, 299 Eddington, Arthur, 75 Effect, law of, 218 eHarmony, 265 Eigenfaces, 215 80/20 rule, 43 Einstein, Albert, 75, 200 Eldredge, Niles, 127 Electronic circuits, genetic programming and, 133–134 Eliza (help desk), 198 EM (expectation maximization) algorithm, 209–210 Emotions, learning and, 218 Empathy-eliciting robots, 285 Empiricists, 57–58 Employment, effect of machine learning on, 276–279 Enlightenment, rationalism vs. empiricism, 58 Entropy, 87 Epinions, 231 Equations, 4, 50 Essay on Population (Malthus), 178, 235 Ethics, robot armies and, 280–281 Eugene Onegin (Pushkin), 153–154 “Explaining away” phenomenon, 163 Evaluation learning algorithms and, 283 Markov logic networks and, 249 Master Algorithm and, 239, 241, 243 Evolution, 28–29, 121–142 Baldwinian, 139 Darwin’s algorithm, 122–128 human-directed, 286–289, 311 Master Algorithm and, 28–29 of robots, 121–122, 137, 303 role of sex in, 134–137 technological, 136–137 See also Genetic algorithms Evolutionaries, 51, 52, 54 Alchemy and, 252–253 exploration-exploitation dilemma, 128–130, 221 further reading, 303–304 genetic programming and, 52 Holland and, 127 Master Algorithm and, 240–241 nature and, 137–139 Evolutionary computation, 121–142 Evolutionary robotics, 121–122, 303 Exclusive-OR function (XOR), 100–101, 112, 195 Exploration-exploitation dilemma, 128–130, 221 Exponential function, machine learning and, 73–74 The Extended Phenotype (Dawkins), 284 Facebook, 44, 291 data and, 14, 274 facial recognition technology, 179–180 machine learning and, 11 relational learning and, 230 sharing via, 271–272 Facial identification, 179–180, 182 False discovery rate, 77, 301 Farming, as analogy for machine learning, 6–7 Feature selection, 188–189 Feature template, 248 Feature weighting, 189 Ferret brain rewiring, 26, 299 Feynman, Richard, 4 Filter bubble, 270 Filtering spam, rule for, 125–127 First principal component of the data, 214 Fisher, Ronald, 122 Fitness Fisher on, 122 in genetic programming, 132 Master Algorithm and, 243 neural learning and, 138–139 sex and, 135 Fitness function, 123–124 Fitness maximum, genetic algorithms and, 127–128, 129 Fix, Evelyn, 178–179, 186 Fodor, Jerry, 38 Forecasting, S curves and, 106 Foundation Medicine, 41, 261 Foundation (Asimov), 232 Fractal geometry, 30, 300 Freakonomics (Dubner & Levitt), 275 Frequentist interpretation of probability, 149 Freund, Yoav, 238 Friedman, Milton, 151 Frontiers, 185, 187, 191, 196 “Funes the Memorious” (Borges), 71 Futility of bias-free learning, 64 FuturICT project, 258 Galileo, 14, 72 Galois, Évariste, 200 Game theory, machine learning and, 20 Gaming, reinforcement learning and, 222 Gates, Bill, 22, 55, 152 GECCO (Genetic and Evolutionary Computing Conference), 136 Gene expression microarrays, 84–85 Generalizations, choosing, 60, 61 Generative model, Bayesian network as, 159 Gene regulation, Bayesian networks and, 159 Genetic algorithms, 122–128 Alchemy and, 252 backpropagation vs., 128 building blocks and, 128–129, 134 schemas, 129 survival of the fittest programs, 131–134 The Genetical Theory of Natural Selection (Fisher), 122 Genetic programming, 52, 131–133, 240, 244, 245, 252, 303–304 sex and, 134–137 Genetic Programming (Koza), 136 Genetic search, 241, 243, 249 Genome, poverty of, 27 Gentner, Dedre, 199 Ghani, Rayid, 17 The Ghost Map (Johnson), 182–183 Gibson, William, 289 Gift economy, 279 Gleevec, 84 Global Alliance for Genomics and Health, 261 Gödel, Escher, Bach 
(Hofstadter), 200 Good, I.


pages: 397 words: 110,130

Smarter Than You Think: How Technology Is Changing Our Minds for the Better by Clive Thompson

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, 4chan, A Declaration of the Independence of Cyberspace, augmented reality, barriers to entry, Benjamin Mako Hill, butterfly effect, citizen journalism, Claude Shannon: information theory, conceptual framework, corporate governance, crowdsourcing, Deng Xiaoping, discovery of penicillin, Douglas Engelbart, Edward Glaeser, en.wikipedia.org, experimental subject, Filter Bubble, Freestyle chess, Galaxy Zoo, Google Earth, Google Glasses, Henri Poincaré, hindsight bias, hive mind, Howard Rheingold, information retrieval, iterative process, jimmy wales, Kevin Kelly, Khan Academy, knowledge worker, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Netflix Prize, Nicholas Carr, patent troll, pattern recognition, pre–internet, Richard Feynman, Ronald Coase, Ronald Reagan, sentiment analysis, Silicon Valley, Skype, Snapchat, Socratic dialogue, spaced repetition, telepresence, telepresence robot, The Nature of the Firm, the scientific method, The Wisdom of Crowds, theory of mind, transaction costs, Vannevar Bush, Watson beat the top human players on Jeopardy!, WikiLeaks, X Prize, éminence grise

Like most of our new digital tools, crafting a good set of weak links takes work. If we don’t engage in that sort of work, it has repercussions. It’s easier to lean into homophily, connecting online to people who are demographically similar: the same age, class, ethnicity and race, even the same profession. Homophily is deeply embedded in our psychology, and as Eli Pariser adroitly points out in The Filter Bubble, digital tools can make homophily worse, narrowing our worldview. For example, Facebook’s news feed analyzes which contacts you most pay attention to and highlights their updates in your “top stories” feed, so you’re liable to hear more and more often from the same small set of people. (Worse, as I’ve discovered, it seems to drop from view the people whom you almost never check in on—which means your weakest ties gradually vanish from sight.)
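To make the narrowing mechanic concrete, here is a minimal sketch, assuming only what the excerpt describes: a feed that ranks contacts by how often you interact with them and surfaces just the top few. It is a toy illustration of that dynamic, not Facebook's actual ranking logic; the interaction log and the `top_k` cutoff are invented for the example.

```python
from collections import Counter

def toy_feed(interactions, top_k=3):
    """Rank contacts by interaction count and surface only the top_k.
    Contacts you rarely check on never appear -- the weak ties that
    gradually vanish from sight in the passage above."""
    counts = Counter(interactions)
    return [name for name, _ in counts.most_common(top_k)]

# Hypothetical interaction log (clicks, likes, profile visits).
log = ["alice"] * 40 + ["bob"] * 25 + ["carol"] * 12 + ["dave"] * 5 + ["erin"]
print(toy_feed(log))  # ['alice', 'bob', 'carol'] -- dave and erin drop out of view
```

Run repeatedly, a loop like this compounds: the more a contact is shown, the more you interact, and the higher that contact ranks the next time around.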

what Malcolm Gladwell called connectors: Malcolm Gladwell, The Tipping Point: How Little Things Can Make a Big Difference (New York: Little, Brown, 2000), 38–41. Peter Diamandis, the head of the X Prize Foundation: Peter H. Diamandis, “Instant Gratification,” in Is the Internet Changing the Way You Think?: The Net’s Impact on Our Minds and Future, ed. John Brockman (New York: HarperCollins, 2011), 214. Facebook’s news feed analyzes: Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin, 2011), 37–38, 217–43. people who are heavily socially active online: Lee Rainie and Barry Wellman, Networked: The New Social Operating System (Cambridge, MA: MIT Press, 2012), Kindle edition. Consider the case of Maureen Evans: Some of my writing here appeared in “Clive Thompson in Praise of Obscurity,” Wired, February 2010, accessed March 26, 2013, www.wired.com/magazine/2010/01/st_thompson_obscurity/.


pages: 278 words: 70,416

Smartcuts: How Hackers, Innovators, and Icons Accelerate Success by Shane Snow

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, Airbnb, Albert Einstein, attribution theory, augmented reality, barriers to entry, conceptual framework, correlation does not imply causation, deliberate practice, Elon Musk, Fellow of the Royal Society, Filter Bubble, Google X / Alphabet X, hive mind, index card, index fund, Isaac Newton, job satisfaction, Khan Academy, Law of Accelerating Returns, Lean Startup, Mahatma Gandhi, meta analysis, meta-analysis, pattern recognition, Peter Thiel, popular electronics, Ray Kurzweil, Richard Florida, Ronald Reagan, Saturday Night Live, self-driving car, side project, Silicon Valley, Steve Jobs

At the time I checked the Facebook statistics, the story had received 12 shares; however, that count may increase as more people discover it. 53 the top story on the hugely popular blog: The BuzzFeed article about ’90s side characters is by Dave Stopera, “20 Supporting Characters from ’90s TV Shows Then and Now,” BuzzFeed, March 27, 2012, http://www.buzzfeed.com/daves4/20-supporting-actors-from-90s-tv-shows-then-and-n (accessed May 27, 2013). I actually did laugh at the Olmec reference. 54 a mellow, unshaven author: Eli Pariser is author of a fascinating book about one of the darker effects of the “personalized” Internet, The Filter Bubble: What the Internet Is Hiding from You (Penguin Press, 2011). 54 including Facebook cofounder: Upworthy’s investor information can be found at CrunchBase, Upworthy, http://www.crunchbase.com/company/upworthy (accessed February 15, 2014). 55 the week after Upworthy launched: The baby meerkats and other disappointed animals can be found at Jack Shepherd, “33 Animals Who Are Extremely Disappointed in You,” BuzzFeed, April 10, 2012, http://www.buzzfeed.com/expresident/animals-who-are-extremely-disappointed-in-you (accessed May 27, 2013). 56 The little comedy theater: You can learn everything you want about Kelly Leonard, executive director of The Second City, and the school itself at The Second City, https://www.secondcity.com.

Raw Data Is an Oxymoron by Lisa Gitelman

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

collateralized debt obligation, computer age, continuous integration, crowdsourcing, Drosophila, Edmond Halley, Filter Bubble, Firefox, Google Earth, Howard Rheingold, index card, informal economy, Isaac Newton, Johann Wolfgang von Goethe, knowledge worker, Louis Daguerre, Menlo Park, optical character recognition, RFID, Richard Thaler, Silicon Valley, social graph, software studies, statistical model, Stephen Hawking, Steven Pinker, text mining, time value of money, trade route, Turing machine, urban renewal, Vannevar Bush

Does the person from whom data originated have claims over it once it enters into circulation on the “data exchange”? Will data follow the model of genetic materials, with data becoming the intellectual property of a data broker who had altered it in some fashion? Proposed policy solutions thus far include improved securitization, transparency and informed consent, expiration dates and storage limits, and the regulation of data centers. 28. On the era of personalization, see Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (New York: Penguin Books, 2011). 29. Kevin D. Haggerty and Richard V. Ericson, The New Politics of Surveillance and Visibility (Toronto: University of Toronto Press, 2006), 4. Dataveillance and Countervailance 30. Matthew Fuller, Media Ecologies: Materialist Energies in Art and Technoculture (Cambridge, MA: MIT Press, 2005), 149. 31. Tiziana Terranova, Network Culture: Politics for the Information Age (London: Pluto Press, 2004), 34. 32.


pages: 339 words: 88,732

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies by Erik Brynjolfsson, Andrew McAfee

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 3D printing, access to a mobile phone, additive manufacturing, Airbnb, Albert Einstein, Amazon Mechanical Turk, Amazon Web Services, American Society of Civil Engineers: Report Card, Any sufficiently advanced technology is indistinguishable from magic, autonomous vehicles, barriers to entry, Baxter: Rethink Robotics, British Empire, business intelligence, business process, call centre, clean water, combinatorial explosion, computer age, computer vision, congestion charging, corporate governance, crowdsourcing, David Ricardo: comparative advantage, employer provided health coverage, en.wikipedia.org, Erik Brynjolfsson, factory automation, falling living standards, Filter Bubble, first square of the chessboard / second half of the chessboard, Frank Levy and Richard Murnane: The New Division of Labor, Freestyle chess, full employment, game design, global village, happiness index / gross national happiness, illegal immigration, immigration reform, income inequality, income per capita, indoor plumbing, industrial robot, informal economy, inventory management, James Watt: steam engine, Jeff Bezos, jimmy wales, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, Khan Academy, knowledge worker, Kodak vs Instagram, law of one price, low skilled workers, Lyft, Mahatma Gandhi, manufacturing employment, Mark Zuckerberg, Mars Rover, means of production, Narrative Science, Nate Silver, natural language processing, Network effects, new economy, New Urbanism, Nicholas Carr, Occupy movement, oil shale / tar sands, oil shock, pattern recognition, payday loans, price stability, Productivity paradox, profit maximization, Ralph Nader, Ray Kurzweil, recommendation engine, Report Card for America’s Infrastructure, Robert Gordon, Rodney Brooks, Ronald Reagan, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Simon Kuznets, six sigma, Skype, software patent, sovereign wealth fund, speech recognition, statistical model, Steve Jobs, Steven Pinker, Stuxnet, supply-chain management, TaskRabbit, technological singularity, telepresence, The Bell Curve by Richard Herrnstein and Charles Murray, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, total factor productivity, transaction costs, Tyler Cowen: Great Stagnation, Vernor Vinge, Watson beat the top human players on Jeopardy!, winner-take-all economy, Y2K

See Kris Wetterstrand, “DNA Sequencing Costs: Data from the NHGRI Genome Sequencing Program (GSP),” National Human Genome Research Institute, July 16, 2013, http://www.genome.gov/sequencingcosts/. 5. On gaming, see Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W. W. Norton & Company, 2011); on cyberbalkanization, see Marshall van Alstyne and Erik Brynjolfsson, “Electronic Communities: Global Villages or Cyberbalkanization?” ICIS 1996 Proceedings, December 31, 1996, http://aisel.aisnet.org/icis1996/5; and Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (New York: Penguin, 2012); on social isolation, see Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other (New York: Basic Books, 2012); and Robert D. Putnam, Bowling Alone: The Collapse and Revival of American Community, 1st ed. (New York: Simon & Schuster, 2001); finally, on environmental degradation, see Albert Gore, The Future: Six Drivers of Global Change, 2013. 6.


pages: 284 words: 92,688

Disrupted: My Misadventure in the Start-Up Bubble by Dan Lyons

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Airbnb, Bernie Madoff, bitcoin, call centre, cleantech, cloud computing, corporate governance, dumpster diving, fear of failure, Filter Bubble, Golden Gate Park, Google Glasses, Googley, Gordon Gekko, hiring and firing, Jeff Bezos, Lean Startup, Lyft, Mark Zuckerberg, Menlo Park, minimum viable product, new economy, Paul Graham, pre–internet, quantitative easing, ride hailing / ride sharing, Rosa Parks, Sand Hill Road, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, Snapchat, software as a service, South of Market, San Francisco, Steve Ballmer, Steve Jobs, Steve Wozniak, telemarketer, tulip mania, Y Combinator, éminence grise

Instead, like Hollywood, or Wall Street, Silicon Valley has become a metaphorical name for an industry, one that exists in Los Angeles, Seattle, New York, Boston, and countless other places, as well as the San Francisco Bay Area. The term bubble, as I use it, refers not only to the economic bubble in which the valuation of some tech start-ups went crazy but also to the mindset of the people working inside technology companies, the true believers and Kool-Aid drinkers, the people who live inside their own filter bubble, brimming with self-confidence and self-regard, impervious to criticism, immunized against reality, unaware of how ridiculous they appear to the outside world. HubSpot, where I worked from April 2013 to December 2014, was part of that bubble. In November 2014, the company floated a successful IPO, and it now has a market value of nearly $2 billion. But this book is about more than HubSpot. This is a story about what it’s like to try to reinvent yourself and start a new career in your fifties, particularly in an industry that is by and large hostile to older workers.


pages: 349 words: 95,972

Messy: The Power of Disorder to Transform Our Lives by Tim Harford

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

affirmative action, Air France Flight 447, Airbnb, airport security, Albert Einstein, Amazon Mechanical Turk, Amazon Web Services, Atul Gawande, autonomous vehicles, banking crisis, Barry Marshall: ulcers, Basel III, Berlin Wall, British Empire, Broken windows theory, call centre, Cass Sunstein, Chris Urmson, cloud computing, collateralized debt obligation, crowdsourcing, deindustrialization, Donald Trump, Erdős number, experimental subject, Ferguson, Missouri, Filter Bubble, Frank Gehry, game design, global supply chain, Googley, Guggenheim Bilbao, high net worth, Inbox Zero, income inequality, Internet of things, Jane Jacobs, Jeff Bezos, Loebner Prize, Louis Pasteur, Mark Zuckerberg, Menlo Park, Merlin Mann, microbiome, out of africa, Paul Erdős, Richard Thaler, Rosa Parks, self-driving car, side project, Silicon Valley, Silicon Valley startup, Skype, Steve Jobs, Steven Levy, Stewart Brand, telemarketer, the built environment, The Death and Life of Great American Cities, Turing test, urban decay

(Many of the tweets made false claims, which were rapidly retweeted.)31 Pierson’s analysis showed that the two groups, with very different views of the world, barely interacted.32 From the middle of one of these groups, surrounded by outrage expressed by like-minded people, it is easy to believe that the world agrees with you. Of course the Internet is full of contrary viewpoints that might challenge our assumptions and encourage us to think more deeply, but few of us realize that we might have to get out and look for those viewpoints. In the words of author and digital activist Eli Pariser, a “filter bubble” exists to give us more of what we already believe. It is sometimes hard to see that bubble for what it is. When our stream of social media updates fits tidily into our preconceptions, we are hardly likely to mess it up by seeking out the people who disagree. The pattern repeats endlessly: we gain new choices about whom to listen to, whom to trust, and whom to befriend—and we use those new choices to tighten the circle around us to people who are more and more like us.


pages: 538 words: 121,670

Republic, Lost: How Money Corrupts Congress--And a Plan to Stop It by Lawrence Lessig

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

asset-backed security, banking crisis, carried interest, cognitive dissonance, corporate personhood, correlation does not imply causation, crony capitalism, David Brooks, Edward Glaeser, Filter Bubble, financial deregulation, financial innovation, financial intermediation, invisible hand, jimmy wales, Martin Wolf, meta analysis, meta-analysis, Mikhail Gorbachev, moral hazard, place-making, profit maximization, Ralph Nader, regulatory arbitrage, rent-seeking, Ronald Reagan, Silicon Valley, single-payer health, The Wealth of Nations by Adam Smith, too big to fail, upwardly mobile, WikiLeaks, Zipcar

I don’t pretend to offer any solution to bad faith, though as I emphasize in “Against Transparency” (New Republic, Oct. 9, 2009), the most obvious solution is to eliminate the suggestion that there may be a conflict. 27. Florence T. Bourgeois, Srinivas Murthy, and Kenneth D. Mandl, “Outcome Reporting Among Drug Trials Registered in ClinicalTrials.gov,” Annals of Internal Medicine 153 no. 3 (Aug. 3, 2010): 158–66, 159, available at link #14. 28. Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (forthcoming, New York: Penguin Press, 2011), 28. 29. Top 1000 Sites—DoubleClick Ad Planner, available at link #15. The $150 million is calculated as follows: $1 per thousand page views, an estimated fourteen billion page views per month, times twelve months is at least $150 million. 30. Interview with author, May 4, 2007. 31. “Therefore I Travel, Company Profile of Lonely Planet,” Tony Wheeler, Lonely Planet, available at link #16. 32.
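The back-of-the-envelope figure in note 29 is easy to check; the sketch below simply replays the footnote's own assumptions ($1 per thousand page views, an estimated fourteen billion page views per month), not independently verified numbers.

```python
# Replaying the footnote's estimate: $1 per thousand page views,
# ~14 billion page views per month, over twelve months.
rate_per_view = 1.0 / 1000            # dollars earned per page view
monthly_views = 14_000_000_000        # the footnote's monthly estimate
annual_revenue = rate_per_view * monthly_views * 12
print(f"${annual_revenue:,.0f}")      # $168,000,000 -- i.e. "at least $150 million"
```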


pages: 742 words: 137,937

The Future of the Professions: How Technology Will Transform the Work of Human Experts by Richard Susskind, Daniel Susskind

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

23andMe, 3D printing, additive manufacturing, AI winter, Albert Einstein, Amazon Mechanical Turk, Amazon Web Services, Andrew Keen, Atul Gawande, Automated Insights, autonomous vehicles, Big bang: deregulation of the City of London, big data - Walmart - Pop Tarts, Bill Joy: nanobots, business process, business process outsourcing, Cass Sunstein, Checklist Manifesto, Clapham omnibus, Clayton Christensen, clean water, cloud computing, computer age, computer vision, conceptual framework, corporate governance, crowdsourcing, Daniel Kahneman / Amos Tversky, death of newspapers, disintermediation, Douglas Hofstadter, en.wikipedia.org, Erik Brynjolfsson, Filter Bubble, Frank Levy and Richard Murnane: The New Division of Labor, full employment, future of work, Google Glasses, Google X / Alphabet X, Hacker Ethic, industrial robot, informal economy, information retrieval, interchangeable parts, Internet of things, Isaac Newton, James Hargreaves, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Khan Academy, knowledge economy, lump of labour, Marshall McLuhan, Narrative Science, natural language processing, Network effects, optical character recognition, personalized medicine, pre–internet, Ray Kurzweil, Richard Feynman, Second Machine Age, self-driving car, semantic web, Skype, social web, speech recognition, spinning jenny, strong AI, supply-chain management, telepresence, the market place, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, transaction costs, Turing test, Watson beat the top human players on Jeopardy!, young professional

Oxman, Neri, Jorge Duro-Royo, Steven Keating, Ben Peters, and Elizabeth Tsai, ‘Towards Robotic Swarm Printing’, Architectural Design, 84: 3 (2014), 108–15. Palfrey, John, and Urs Gasser, Born Digital (New York: Basic Books, 2008). Paliwala, Abdul (ed.), A History of Legal Informatics (Zaragoza: Prensas Universitarias de Zaragoza, 2010). Panel on Fair Access to the Professions, Unleashing Aspiration (London: Cabinet Office, 2009). Parfit, Derek, Reasons and Persons (Oxford: Clarendon Press, 1987). Pariser, Eli, The Filter Bubble (London: Penguin Books, 2012). Parsons, Matthew, Effective Knowledge Management for Law Firms (New York: Oxford University Press, 2004). Parsons, Talcott, ‘The Professions and Social Structure’, Social Forces, 17: 4 (1939), 457–67. Parsons, Talcott, The Social System (New York: Free Press, 1951). Parsons, Talcott, Essays in Sociological Theory (New York: Free Press, 1964). Parsons, Talcott, The Structure of Social Action, 2 vols., paperback edn.